Dec 03 14:06:26 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 14:06:26 crc restorecon[4700]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 14:06:26 crc restorecon[4700]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 
crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:26 crc restorecon[4700]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 
crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc 
restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:26 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc 
restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 14:06:27 crc restorecon[4700]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 14:06:27 crc kubenswrapper[5004]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.452150 5004 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458127 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458187 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458193 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458198 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458203 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458207 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458213 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458217 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458221 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458226 5004 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458231 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458235 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458240 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458244 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458248 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458254 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458258 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458288 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458293 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458299 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458303 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458307 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458312 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458316 5004 feature_gate.go:330] unrecognized feature gate: 
MixedCPUsAllocation Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458320 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458325 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458330 5004 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458334 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458339 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458344 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458348 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458353 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458357 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458369 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458376 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458383 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458387 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458392 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458397 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458402 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458407 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458411 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458418 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458424 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458429 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458433 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458438 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458443 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458448 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458452 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458457 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458463 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458467 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458472 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458476 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458481 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458485 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458489 5004 feature_gate.go:330] 
unrecognized feature gate: Example Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458493 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458497 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458502 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458506 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458510 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458514 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458519 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458523 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458530 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458537 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458543 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458549 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.458554 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458892 5004 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458912 5004 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458923 5004 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458931 5004 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458938 5004 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458944 5004 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458952 5004 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458958 5004 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458963 5004 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458969 5004 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458976 5004 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458982 5004 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458989 5004 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.458995 5004 flags.go:64] 
FLAG: --cgroup-root="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459001 5004 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459007 5004 flags.go:64] FLAG: --client-ca-file="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459012 5004 flags.go:64] FLAG: --cloud-config="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459017 5004 flags.go:64] FLAG: --cloud-provider="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459022 5004 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459028 5004 flags.go:64] FLAG: --cluster-domain="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459033 5004 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459038 5004 flags.go:64] FLAG: --config-dir="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459043 5004 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459048 5004 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459055 5004 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459061 5004 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459066 5004 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459071 5004 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459076 5004 flags.go:64] FLAG: --contention-profiling="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459081 5004 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459087 5004 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459093 5004 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459098 5004 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459106 5004 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459112 5004 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459117 5004 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459123 5004 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459128 5004 flags.go:64] FLAG: --enable-server="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459134 5004 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459141 5004 flags.go:64] FLAG: --event-burst="100" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459147 5004 flags.go:64] FLAG: --event-qps="50" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459152 5004 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459158 5004 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459163 5004 flags.go:64] FLAG: --eviction-hard="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459171 5004 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459177 5004 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459182 5004 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459188 5004 
flags.go:64] FLAG: --eviction-soft="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459193 5004 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459198 5004 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459203 5004 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459209 5004 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459214 5004 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459219 5004 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459223 5004 flags.go:64] FLAG: --feature-gates="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459230 5004 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459235 5004 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459241 5004 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459246 5004 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459253 5004 flags.go:64] FLAG: --healthz-port="10248" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459258 5004 flags.go:64] FLAG: --help="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459264 5004 flags.go:64] FLAG: --hostname-override="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459269 5004 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459275 5004 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459280 5004 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459285 5004 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459290 5004 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459296 5004 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459301 5004 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459306 5004 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459311 5004 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459316 5004 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459321 5004 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459326 5004 flags.go:64] FLAG: --kube-reserved="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459331 5004 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459335 5004 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459340 5004 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459344 5004 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459349 5004 flags.go:64] FLAG: --lock-file="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459353 5004 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459357 5004 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459361 5004 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459375 5004 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459380 5004 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459384 5004 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459389 5004 flags.go:64] FLAG: --logging-format="text" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459393 5004 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459399 5004 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459408 5004 flags.go:64] FLAG: --manifest-url="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459416 5004 flags.go:64] FLAG: --manifest-url-header="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459424 5004 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459429 5004 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459436 5004 flags.go:64] FLAG: --max-pods="110" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459441 5004 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459447 5004 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459452 5004 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459457 5004 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459462 5004 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 14:06:27 crc 
kubenswrapper[5004]: I1203 14:06:27.459467 5004 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459472 5004 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459483 5004 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459487 5004 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459491 5004 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459495 5004 flags.go:64] FLAG: --pod-cidr="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459499 5004 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459506 5004 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459510 5004 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459516 5004 flags.go:64] FLAG: --pods-per-core="0" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459521 5004 flags.go:64] FLAG: --port="10250" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459526 5004 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459530 5004 flags.go:64] FLAG: --provider-id="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459534 5004 flags.go:64] FLAG: --qos-reserved="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459538 5004 flags.go:64] FLAG: --read-only-port="10255" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459542 5004 flags.go:64] FLAG: --register-node="true" Dec 03 14:06:27 crc 
kubenswrapper[5004]: I1203 14:06:27.459546 5004 flags.go:64] FLAG: --register-schedulable="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459551 5004 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459558 5004 flags.go:64] FLAG: --registry-burst="10" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459562 5004 flags.go:64] FLAG: --registry-qps="5" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459566 5004 flags.go:64] FLAG: --reserved-cpus="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459573 5004 flags.go:64] FLAG: --reserved-memory="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459584 5004 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459592 5004 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459598 5004 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459603 5004 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459608 5004 flags.go:64] FLAG: --runonce="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459614 5004 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459620 5004 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459626 5004 flags.go:64] FLAG: --seccomp-default="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459632 5004 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459637 5004 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459643 5004 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 
03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459648 5004 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459653 5004 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459659 5004 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459665 5004 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459671 5004 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459676 5004 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459681 5004 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459687 5004 flags.go:64] FLAG: --system-cgroups="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459694 5004 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459713 5004 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459721 5004 flags.go:64] FLAG: --tls-cert-file="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459727 5004 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459735 5004 flags.go:64] FLAG: --tls-min-version="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459742 5004 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459748 5004 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459754 5004 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459761 5004 flags.go:64] FLAG: 
--topology-manager-scope="container" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459767 5004 flags.go:64] FLAG: --v="2" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459776 5004 flags.go:64] FLAG: --version="false" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459784 5004 flags.go:64] FLAG: --vmodule="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459792 5004 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.459799 5004 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459955 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459963 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459970 5004 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459975 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459980 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459985 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459989 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459994 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.459998 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460003 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags 
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460007 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460012 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460017 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460021 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460025 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460029 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460034 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460038 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460043 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460047 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460052 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460057 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460061 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460065 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460070 5004 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460074 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460078 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460082 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460087 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460091 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460095 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460101 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460107 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460112 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460117 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460122 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460127 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460132 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460140 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460150 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460155 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460159 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460163 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460169 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460173 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460178 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460182 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460186 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460190 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460195 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460200 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460204 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460209 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460214 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460220 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460225 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460230 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460235 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460238 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460242 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460246 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460249 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460253 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460256 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460260 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460264 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460267 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460271 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460275 5004 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460279 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.460282 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.460290 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.467916 5004 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.467968 5004 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468036 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468044 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468049 5004 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468053 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468056 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468060 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468064 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468068 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468072 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468076 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468079 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468083 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468087 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468090 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468094 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468098 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468105 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468109 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468114 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468118 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468121 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468125 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468129 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468133 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468136 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468140 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468143 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468147 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468152 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468156 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468160 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468164 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468168 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468172 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468177 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468181 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468185 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468189 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468192 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468196 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468200 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468204 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468207 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468211 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468215 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468218 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468221 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468225 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468229 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468233 5004 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468237 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468240 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468244 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468248 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468251 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468256 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468260 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468264 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468267 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468271 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468274 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468278 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468281 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468286 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468290 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468295 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468299 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468309 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468315 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468321 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468327 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.468336 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468738 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468750 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468755 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468760 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468765 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468769 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468773 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468778 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468782 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468788 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468792 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468796 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468801 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468806 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468810 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468815 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468821 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468825 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468829 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468834 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468838 5004 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468842 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468846 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468851 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468872 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468877 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468881 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468885 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468891 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468896 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468901 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468905 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468910 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468915 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468921 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468926 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468930 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468936 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468941 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468945 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468950 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468956 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468961 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468966 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468971 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468975 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468979 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468983 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468987 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468992 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.468996 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469000 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469004 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469008 5004 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469011 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469015 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469018 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469023 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469026 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469030 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469035 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469040 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469044 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469047 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469051 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469055 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469059 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469062 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469065 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469069 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.469073 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.469079 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.469265 5004 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.471832 5004 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.471932 5004 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.472425 5004 server.go:997] "Starting client certificate rotation"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.472452 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.473058 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-09 01:03:02.196017395 +0000 UTC
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.473178 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.480195 5004 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.482188 5004 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.485040 5004 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.492564 5004 log.go:25] "Validated CRI v1 runtime API"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.509178 5004 log.go:25] "Validated CRI v1 image API"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.511312 5004 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.514065 5004 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-14-01-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.514105 5004 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.528558 5004 manager.go:217] Machine: {Timestamp:2025-12-03 14:06:27.5269491 +0000 UTC m=+0.275919356 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:114765c9-74c4-4d31-b4e3-ce1e142d6291 BootID:1aad4288-4570-4881-9224-58a7361c56c9 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:53:ce:6c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:53:ce:6c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:31:1b:2d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ac:d5:47 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6c:d7:50 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ec:0e:48 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:db:c4:b1:fc:dd Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:96:29:99:59:7f:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.528874 5004 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529039 5004 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529354 5004 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529535 5004 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529571 5004 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529801 5004 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.529812 5004 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.530067 5004 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.530111 5004 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.530497 5004 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.530588 5004 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.531422 5004 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.531448 5004 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.531472 5004 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.531486 5004 kubelet.go:324] "Adding apiserver pod source"
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.531498 5004 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.534946 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.534998 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.535049 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.535008 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.535088 5004 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.535508 5004 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.536608 5004 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537585 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537632 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537641 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537648 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537660 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537670 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537680 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537694 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537704 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.537714 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.538015 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.543343 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.543605 5004 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.544213 5004 server.go:1280] "Started kubelet" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.544546 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.545468 5004 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.545602 5004 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 14:06:27 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.547462 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.547491 5004 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.548040 5004 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.548075 5004 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.547294 5004 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.549541 5004 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.550224 5004 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.552533 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration 
is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:37:58.143968693 +0000 UTC Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.552601 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 394h31m30.591373745s for next certificate rotation Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.553125 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187db9ae5d8e5d00 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:06:27.544161536 +0000 UTC m=+0.293131772,LastTimestamp:2025-12-03 14:06:27.544161536 +0000 UTC m=+0.293131772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.555280 5004 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.555317 5004 factory.go:55] Registering systemd factory Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.555328 5004 factory.go:221] Registration of the systemd container factory successfully Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.552762 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: 
connect: connection refused Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.555931 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.556291 5004 server.go:460] "Adding debug handlers to kubelet server" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.556781 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.557788 5004 factory.go:153] Registering CRI-O factory Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.557834 5004 factory.go:221] Registration of the crio container factory successfully Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.557877 5004 factory.go:103] Registering Raw factory Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.557900 5004 manager.go:1196] Started watching for new ooms in manager Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.559079 5004 manager.go:319] Starting recovery of all containers Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563815 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563909 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563926 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563941 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563954 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.563990 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564002 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564011 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564024 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564033 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564042 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564051 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564060 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564072 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564081 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564089 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564099 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564109 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564118 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564129 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564138 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564147 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564156 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564165 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564173 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564184 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564199 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564214 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564226 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564239 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564250 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564261 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: 
I1203 14:06:27.564273 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564286 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564296 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564311 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564321 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564330 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564343 5004 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564354 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564363 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564373 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564383 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564392 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564401 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564411 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564420 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564429 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564439 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564448 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564458 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564468 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564480 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564491 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564505 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564517 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564531 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564545 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564556 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564570 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564586 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564599 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564611 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564622 
5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564635 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564648 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564661 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564672 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564684 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564696 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564709 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564721 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564735 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564749 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564761 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564775 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564788 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564804 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564818 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564830 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564843 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564880 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564891 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564904 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564916 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564925 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564935 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564944 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564954 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564964 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564974 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564983 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.564993 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565003 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565012 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565020 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565028 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565036 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565045 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565054 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565085 5004 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565094 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565103 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565112 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565126 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565137 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565146 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565156 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565173 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565190 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565200 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565210 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565219 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565247 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565256 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565265 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565273 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565283 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565292 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" 
seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565300 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565308 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565318 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565328 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565336 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565345 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 
14:06:27.565354 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565364 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565373 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565382 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565390 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565398 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565406 5004 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565414 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565422 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565430 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565438 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565445 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565454 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.565463 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567156 5004 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567219 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567239 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567252 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567265 5004 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567280 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567295 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567307 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567356 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567370 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567390 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567402 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567415 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567429 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567441 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567453 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567465 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567477 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567492 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567504 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567519 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567531 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567544 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567558 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567570 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567584 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567596 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567611 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567624 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567636 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567650 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567664 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567677 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567691 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567709 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567722 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567734 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567746 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.567760 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569428 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569451 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569478 
5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569497 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569515 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569526 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569537 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569560 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569575 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569593 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569605 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569621 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569640 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569654 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569668 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569687 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569702 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569722 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569735 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569748 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569768 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569785 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569803 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569817 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569833 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569846 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569871 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569889 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569901 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569911 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569922 5004 reconstruct.go:97] "Volume reconstruction finished" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.569936 5004 reconciler.go:26] "Reconciler: start to sync state" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.585974 5004 manager.go:324] Recovery completed Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.598387 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.601031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.601074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.601085 
5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.602409 5004 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.602428 5004 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.602455 5004 state_mem.go:36] "Initialized new in-memory state store" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.609394 5004 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.611563 5004 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.611626 5004 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.611669 5004 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.611749 5004 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 14:06:27 crc kubenswrapper[5004]: W1203 14:06:27.612436 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.612491 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:27 
crc kubenswrapper[5004]: I1203 14:06:27.613033 5004 policy_none.go:49] "None policy: Start" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.613723 5004 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.613750 5004 state_mem.go:35] "Initializing new in-memory state store" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.649664 5004 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667191 5004 manager.go:334] "Starting Device Plugin manager" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667306 5004 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667317 5004 server.go:79] "Starting device plugin registration server" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667746 5004 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667771 5004 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.667919 5004 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.668039 5004 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.668049 5004 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.676915 5004 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.712244 5004 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.712413 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.713658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.713722 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.713739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.713940 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.714248 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.714318 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715014 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715180 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715404 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.715460 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716247 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716419 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716524 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.716573 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717430 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717653 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717654 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717760 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.717847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718389 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718511 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718534 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.718998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.719039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.719181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.719211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.719241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.757941 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.768675 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.769982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 
14:06:27.770022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.770034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.770060 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.770578 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771595 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771634 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771688 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771714 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771788 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771823 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771895 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771932 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.771988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.772004 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.772033 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.772050 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873410 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873534 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873561 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 
03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873610 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873670 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873742 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873806 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873776 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873959 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.873996 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874018 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874053 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874078 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874099 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874148 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc 
kubenswrapper[5004]: I1203 14:06:27.874163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874144 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874251 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874192 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874268 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874293 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874296 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.874292 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.970920 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.972116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.972158 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.972169 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:27 crc kubenswrapper[5004]: I1203 14:06:27.972194 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:06:27 crc kubenswrapper[5004]: E1203 14:06:27.972641 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.045957 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.065507 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.070138 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-89a9249aa54df466023734d992d6b72616e84226c492efea63c0222fc5b94f50 WatchSource:0}: Error finding container 89a9249aa54df466023734d992d6b72616e84226c492efea63c0222fc5b94f50: Status 404 returned error can't find the container with id 89a9249aa54df466023734d992d6b72616e84226c492efea63c0222fc5b94f50 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.086037 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.086430 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e83f9af18b1be9388e56ec0443722397a89af642532b727eb13ead88ddc190a3 WatchSource:0}: Error finding container e83f9af18b1be9388e56ec0443722397a89af642532b727eb13ead88ddc190a3: Status 404 returned error can't find the container with id e83f9af18b1be9388e56ec0443722397a89af642532b727eb13ead88ddc190a3 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.107090 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.115205 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.121891 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3a66588717883b1543854c133a18c573bbae31fa0d05b9b777eeb40577b0102b WatchSource:0}: Error finding container 3a66588717883b1543854c133a18c573bbae31fa0d05b9b777eeb40577b0102b: Status 404 returned error can't find the container with id 3a66588717883b1543854c133a18c573bbae31fa0d05b9b777eeb40577b0102b Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.129610 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5f5ac5094f2f3dd0c8c458712d3b08f58e99a7919c1bda1b8e06b2028e0a4343 WatchSource:0}: Error finding container 5f5ac5094f2f3dd0c8c458712d3b08f58e99a7919c1bda1b8e06b2028e0a4343: Status 404 returned error can't find 
the container with id 5f5ac5094f2f3dd0c8c458712d3b08f58e99a7919c1bda1b8e06b2028e0a4343 Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.159314 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.189512 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187db9ae5d8e5d00 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:06:27.544161536 +0000 UTC m=+0.293131772,LastTimestamp:2025-12-03 14:06:27.544161536 +0000 UTC m=+0.293131772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.373527 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.374928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.374965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.374977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc 
kubenswrapper[5004]: I1203 14:06:28.375003 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.375416 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.475200 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.475303 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.545783 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.616902 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.616999 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3a66588717883b1543854c133a18c573bbae31fa0d05b9b777eeb40577b0102b"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.619103 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5" exitCode=0 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.619178 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.619251 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15aca7dabbc0cba04c61b6de0e8d01f694957c03802ed89c69542c1ebe4b90eb"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.619447 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.619600 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.619654 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.620926 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.620964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.620972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.622497 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.622825 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c" exitCode=0 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.622887 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.622927 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e83f9af18b1be9388e56ec0443722397a89af642532b727eb13ead88ddc190a3"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.623073 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.623440 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.623472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc 
kubenswrapper[5004]: I1203 14:06:28.623485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.624317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.624339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.624350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.625587 5004 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="796bcb0af6f5f03ec4aa89bb3dfeb97869dbee6b88d5412a73d18dc288cf628b" exitCode=0 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.625649 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"796bcb0af6f5f03ec4aa89bb3dfeb97869dbee6b88d5412a73d18dc288cf628b"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.625679 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"89a9249aa54df466023734d992d6b72616e84226c492efea63c0222fc5b94f50"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.625760 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.626602 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.626642 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.626652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.627814 5004 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b" exitCode=0 Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.627840 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.627867 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5f5ac5094f2f3dd0c8c458712d3b08f58e99a7919c1bda1b8e06b2028e0a4343"} Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.627961 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.628604 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.628628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:28 crc kubenswrapper[5004]: I1203 14:06:28.628675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.742356 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.742453 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:28 crc kubenswrapper[5004]: W1203 14:06:28.938273 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.938366 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 03 14:06:28 crc kubenswrapper[5004]: E1203 14:06:28.960407 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.176534 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.183290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.183357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.183370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.183394 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:06:29 crc kubenswrapper[5004]: E1203 14:06:29.183909 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.589168 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.633784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2"} Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.633850 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111"} Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.633887 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2"} Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.633902 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.633913 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.634031 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.635189 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.635219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.635245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636048 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611" exitCode=0
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636113 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636215 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.636997 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.638294 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ad046f8be328efdde718a51fc64e5782d31488fb832540d33d7352c51c6e8341"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.638382 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.639276 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.639315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.639329 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.641004 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.641045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.641060 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.641169 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.642123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.642154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.642166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.644663 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.644701 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.644715 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da"}
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.644761 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.645545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.645603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.645624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:29 crc kubenswrapper[5004]: I1203 14:06:29.828107 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.648737 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a" exitCode=0
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.648850 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.648908 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.649347 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a"}
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.649446 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.649782 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650167 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650445 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650420 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650445 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.650621 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.651405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.651469 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.651479 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.784938 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.786007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.786049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.786066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:30 crc kubenswrapper[5004]: I1203 14:06:30.786095 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655156 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666"}
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655224 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655508 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0"}
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655525 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b"}
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168"}
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655547 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d"}
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.655670 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656534 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656549 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:31 crc kubenswrapper[5004]: I1203 14:06:31.656558 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.519239 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.657570 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.659019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.659052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.659063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:32 crc kubenswrapper[5004]: I1203 14:06:32.768325 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.108439 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.108710 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.110326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.110377 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.110390 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.136047 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.661146 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.661345 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.664459 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.665593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.665645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.665647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.665692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.665704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:33 crc kubenswrapper[5004]: I1203 14:06:33.877971 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:34 crc kubenswrapper[5004]: I1203 14:06:34.662958 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:34 crc kubenswrapper[5004]: I1203 14:06:34.664284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:34 crc kubenswrapper[5004]: I1203 14:06:34.664335 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:34 crc kubenswrapper[5004]: I1203 14:06:34.664346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.018588 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.018820 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.021050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.021102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.021117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.108986 5004 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.109081 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.148304 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.667471 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.668344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.668492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.668584 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.703488 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.703671 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.704672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.704699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:36 crc kubenswrapper[5004]: I1203 14:06:36.704707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:37 crc kubenswrapper[5004]: E1203 14:06:37.677039 5004 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.566512 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.566679 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.567742 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.567782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.567793 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.571602 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.673040 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.674088 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.674135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.674148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:38 crc kubenswrapper[5004]: I1203 14:06:38.677293 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.546027 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 03 14:06:39 crc kubenswrapper[5004]: E1203 14:06:39.593067 5004 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.674932 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.675765 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.675798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.675811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.828670 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.828750 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.834233 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.834295 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.839273 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 14:06:39 crc kubenswrapper[5004]: I1203 14:06:39.839351 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.525662 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.525815 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.526201 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.526311 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.526778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.526801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.526810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.530188 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.697128 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.697552 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.697605 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.698058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.698094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:06:42 crc kubenswrapper[5004]: I1203 14:06:42.698104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:06:43 crc kubenswrapper[5004]: I1203 14:06:43.668930 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 14:06:43 crc kubenswrapper[5004]: I1203 14:06:43.689929 5004 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Dec 03 14:06:44 crc kubenswrapper[5004]: E1203 14:06:44.840033 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842474 5004 trace.go:236] Trace[1640688117]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:06:32.131) (total time: 12710ms):
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1640688117]: ---"Objects listed" error: 12710ms (14:06:44.842)
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1640688117]: [12.710627856s] [12.710627856s] END
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842532 5004 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842811 5004 trace.go:236] Trace[1941429196]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:06:30.645) (total time: 14197ms):
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1941429196]: ---"Objects listed" error: 14197ms (14:06:44.842)
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1941429196]: [14.197717403s] [14.197717403s] END
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842828 5004 trace.go:236] Trace[753031719]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:06:31.218) (total time: 13624ms):
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[753031719]: ---"Objects listed" error: 13624ms (14:06:44.842)
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[753031719]: [13.624082655s] [13.624082655s] END
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842888 5004 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.842835 5004 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.843606 5004 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.845578 5004 trace.go:236] Trace[1527504028]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 14:06:31.370) (total time: 13474ms):
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1527504028]: ---"Objects listed" error: 13474ms (14:06:44.845)
Dec 03 14:06:44 crc kubenswrapper[5004]: Trace[1527504028]: [13.4748147s] [13.4748147s] END
Dec 03 14:06:44 crc kubenswrapper[5004]: I1203 14:06:44.845617 5004 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 14:06:44 crc kubenswrapper[5004]: E1203 14:06:44.847133 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.542243 5004 apiserver.go:52] "Watching apiserver"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.546026 5004 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.546322 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.546740 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.546816 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.546889 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.546740 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.547566 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.547606 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.547632 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.547650 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.547567 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.549369 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.549387 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.549381 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.549587 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.550128 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.550218 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.550494 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.551151 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.551209 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 14:06:45 crc
kubenswrapper[5004]: I1203 14:06:45.554684 5004 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.598523 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.611580 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.626310 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.635191 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648587 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648644 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648666 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648688 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648756 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648777 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648796 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 
14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648819 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648839 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648879 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648902 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648921 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648971 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.648996 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649014 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649057 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " 
Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649077 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649096 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649116 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649138 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649157 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649144 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
(OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649181 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649206 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649231 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649295 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649372 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649575 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649255 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649615 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649637 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649662 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649686 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649731 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649752 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:06:45 crc 
kubenswrapper[5004]: I1203 14:06:45.649774 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649799 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649835 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649878 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649901 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649925 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649948 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649970 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649992 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650014 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650060 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650109 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650134 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650164 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650188 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650219 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650243 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650287 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650310 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650331 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650355 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650375 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650396 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650421 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650446 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650469 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650492 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650516 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650538 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650559 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650586 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650714 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650739 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650776 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650803 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650827 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650849 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650890 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650935 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650957 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650977 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651000 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651022 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651045 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651067 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 
14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651088 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651111 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651134 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651156 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651179 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651200 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651223 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651241 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651257 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651275 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651292 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651309 
5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651328 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651347 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651364 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651381 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651431 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651459 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651481 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651506 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651528 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651547 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651563 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" 
(UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651579 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651597 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651621 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651667 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:06:45 crc 
kubenswrapper[5004]: I1203 14:06:45.651691 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651715 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651735 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651754 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651779 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651800 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649770 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649922 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649932 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.649964 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650136 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650132 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650121 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650362 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650415 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650462 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650534 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650587 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650674 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650670 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650816 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.650981 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651013 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651065 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651179 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651191 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651214 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651243 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651331 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651375 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651413 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652040 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651576 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651588 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651617 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651638 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651716 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651796 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651813 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652036 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651447 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652139 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652239 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652717 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.652797 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.653051 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.653078 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.653495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.653586 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.653952 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.654752 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.654895 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.655032 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.655360 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.655471 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.655827 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.656706 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.656760 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657059 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657628 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657941 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657958 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657974 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.658065 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:06:46.158041842 +0000 UTC m=+18.907012158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658185 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658293 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.657947 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658565 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658578 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658921 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658963 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.658927 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.659347 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.656076 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.659568 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660291 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.651821 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660565 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660598 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 
03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660626 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660655 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660678 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660700 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660725 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660840 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660878 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660888 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660901 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660952 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660972 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.660995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661000 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661017 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661039 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661062 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661087 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661112 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661135 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661158 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661182 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661203 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661224 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661246 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661268 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661269 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661292 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661315 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661339 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661361 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661387 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661391 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661439 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661462 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661486 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661506 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661531 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661535 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661552 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661572 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661582 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661591 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661636 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661664 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661688 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.661974 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662047 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662108 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662154 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662181 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662205 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662229 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662250 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662273 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662295 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662392 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662482 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662508 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662530 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662553 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662569 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662574 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662642 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662672 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662694 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662838 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663109 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663281 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663647 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.662720 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663953 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.663978 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664002 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664026 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664046 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664050 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664103 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664137 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664162 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664247 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664272 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664298 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664342 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664371 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664396 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664415 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664425 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664483 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664512 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664570 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664622 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664649 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664671 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664800 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664820 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664835 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664849 5004 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664887 5004 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664903 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664920 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664933 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664947 5004 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664959 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664971 5004 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664983 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664996 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665009 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665022 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665034 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665046 5004 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665058 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665070 5004 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665082 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665094 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665106 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665118 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665128 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665140 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665153 5004 reconciler_common.go:293] "Volume detached 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665165 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665177 5004 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665191 5004 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665203 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665215 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665229 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665240 5004 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665252 5004 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665263 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665275 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665288 5004 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665301 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665316 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665330 5004 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665343 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665355 5004 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665367 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665380 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665395 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665409 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665422 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665435 5004 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665447 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665460 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665471 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665482 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665494 5004 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665507 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665519 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665532 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665545 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665556 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665569 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665584 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665598 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665611 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665625 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665637 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665650 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665662 5004 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665676 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665688 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665705 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665717 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665728 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665740 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665753 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665764 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665776 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665788 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665800 5004 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665812 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665824 5004 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665835 5004 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665846 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665887 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 
14:06:45.665901 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665912 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665925 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665936 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665947 5004 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665960 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665973 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665985 5004 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665996 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.674790 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.678112 5004 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.682263 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.683824 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.685743 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664479 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664710 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.664773 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665016 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.687491 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.687590 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.687826 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.688344 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.688345 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.688787 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.689004 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.689328 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.689922 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.690211 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.690405 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.690414 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665600 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665727 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.690459 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.691103 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.691197 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.691479 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.692066 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.690819 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.694210 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.694664 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.694931 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665405 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.666084 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.695277 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:46.195201061 +0000 UTC m=+18.944171367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.695281 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666368 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666513 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666734 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666793 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666906 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.666909 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667207 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667246 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667177 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667300 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667475 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667502 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667659 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667677 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.667955 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.668293 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.668378 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.668699 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.668715 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.669016 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.669023 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.669304 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.669344 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.669371 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.670242 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.670257 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.670284 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671356 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671501 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671620 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671706 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671767 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.671913 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.672068 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.695703 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:46.195674115 +0000 UTC m=+18.944644341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.676549 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.676890 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.676894 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.677571 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.677725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.678016 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.678783 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.678815 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.679058 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.679255 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.679662 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.680019 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.685902 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.665948 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.696113 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.696617 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.696989 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.698678 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.698955 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.699141 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.699576 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.699603 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.699618 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.699680 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:46.199660262 +0000 UTC m=+18.948630578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.700149 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.700329 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.700362 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.700378 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:45 crc kubenswrapper[5004]: E1203 14:06:45.700413 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:46.200403214 +0000 UTC m=+18.949373450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.700937 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.703976 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.704008 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.704109 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.704613 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.704712 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.708279 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.708610 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.709170 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.709256 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.709817 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.711365 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.714823 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.715977 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.718393 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.720476 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.720932 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.720964 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.721184 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.721387 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.723219 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2" exitCode=255 Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.723257 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2"} Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.723469 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.723026 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.726261 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.727212 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.733743 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.736586 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.739569 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.744229 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.744541 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.759901 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.762202 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769384 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769443 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769516 5004 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769530 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769542 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769555 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769570 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769582 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769593 5004 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769603 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769617 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769628 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769639 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769650 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769664 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769675 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on 
node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769686 5004 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769700 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769711 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769722 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769734 5004 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769748 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769761 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769771 5004 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769782 5004 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769797 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769810 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769821 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769835 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769846 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769874 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769886 5004 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769902 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769913 5004 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769924 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769935 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769949 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769960 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 
03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769972 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769984 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.769999 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770010 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770020 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770034 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770045 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770057 5004 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770071 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770082 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770095 5004 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770107 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770118 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770118 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770128 5004 reconciler_common.go:293] "Volume 
detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770145 5004 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770147 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770157 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770279 5004 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770295 5004 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770305 5004 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770315 5004 reconciler_common.go:293] "Volume 
detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770331 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770340 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770349 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770359 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770372 5004 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770381 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770393 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770402 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770414 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770423 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770433 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770453 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770461 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770470 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on 
node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770479 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770490 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770500 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770509 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770518 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770529 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770537 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 
14:06:45.770547 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770556 5004 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770568 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770576 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770585 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770596 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770606 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770615 5004 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770624 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770635 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770645 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770654 5004 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770665 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770677 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770686 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770695 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770707 5004 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770716 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770725 5004 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.770765 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.773550 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.773582 5004 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc 
kubenswrapper[5004]: I1203 14:06:45.773592 5004 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.773603 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.773615 5004 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.773629 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.783167 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.791047 5004 csr.go:261] certificate signing request csr-xx744 is approved, waiting to be issued Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.801009 5004 csr.go:257] certificate signing request csr-xx744 is issued Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.802737 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.815535 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.818805 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.818902 5004 scope.go:117] "RemoveContainer" containerID="3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.836526 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.850581 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.863468 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.874029 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-z2zlx"] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.874391 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.875250 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.878151 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.878824 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.878356 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.883616 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.896088 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.912658 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.938199 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.968213 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.974092 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-m4g6v"] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.974486 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.975748 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f436781-8ddc-4947-a631-d020ec46f63a-hosts-file\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.975790 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7pqb\" (UniqueName: \"kubernetes.io/projected/3f436781-8ddc-4947-a631-d020ec46f63a-kube-api-access-z7pqb\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.976995 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.977196 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.977378 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.981298 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 14:06:45 
crc kubenswrapper[5004]: I1203 14:06:45.985259 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.985378 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kvvjx"] Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.985751 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.987736 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.987875 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.987946 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.987740 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 14:06:45 crc kubenswrapper[5004]: I1203 14:06:45.998095 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.006078 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.016937 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.029610 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.042007 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.054060 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.063914 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.073536 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.076722 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5acc6204-5c3a-4d00-9d86-13415fb3f68f-host\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.076761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndf4\" (UniqueName: \"kubernetes.io/projected/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-kube-api-access-tndf4\") pod 
\"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.076787 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-mcd-auth-proxy-config\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.076957 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-rootfs\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077021 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f436781-8ddc-4947-a631-d020ec46f63a-hosts-file\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077087 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-proxy-tls\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077127 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7pqb\" (UniqueName: 
\"kubernetes.io/projected/3f436781-8ddc-4947-a631-d020ec46f63a-kube-api-access-z7pqb\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077150 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5acc6204-5c3a-4d00-9d86-13415fb3f68f-serviceca\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077168 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f436781-8ddc-4947-a631-d020ec46f63a-hosts-file\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.077176 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsq24\" (UniqueName: \"kubernetes.io/projected/5acc6204-5c3a-4d00-9d86-13415fb3f68f-kube-api-access-tsq24\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.088399 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.097246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7pqb\" (UniqueName: \"kubernetes.io/projected/3f436781-8ddc-4947-a631-d020ec46f63a-kube-api-access-z7pqb\") pod \"node-resolver-z2zlx\" (UID: \"3f436781-8ddc-4947-a631-d020ec46f63a\") " pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.098171 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.110777 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.121875 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178481 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178563 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-proxy-tls\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178614 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5acc6204-5c3a-4d00-9d86-13415fb3f68f-serviceca\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178635 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsq24\" (UniqueName: \"kubernetes.io/projected/5acc6204-5c3a-4d00-9d86-13415fb3f68f-kube-api-access-tsq24\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.178675 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:06:47.178652341 +0000 UTC m=+19.927622587 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178700 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5acc6204-5c3a-4d00-9d86-13415fb3f68f-host\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178729 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndf4\" (UniqueName: \"kubernetes.io/projected/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-kube-api-access-tndf4\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178754 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-mcd-auth-proxy-config\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178783 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-rootfs\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178834 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-rootfs\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.178890 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5acc6204-5c3a-4d00-9d86-13415fb3f68f-host\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.180249 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-mcd-auth-proxy-config\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.182281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-proxy-tls\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.189079 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-z2zlx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.190884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5acc6204-5c3a-4d00-9d86-13415fb3f68f-serviceca\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.200768 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsq24\" (UniqueName: \"kubernetes.io/projected/5acc6204-5c3a-4d00-9d86-13415fb3f68f-kube-api-access-tsq24\") pod \"node-ca-kvvjx\" (UID: \"5acc6204-5c3a-4d00-9d86-13415fb3f68f\") " pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.201547 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: W1203 14:06:46.201825 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f436781_8ddc_4947_a631_d020ec46f63a.slice/crio-f95e69aff1073d3fb182e521b1e4b5b7574167d38432e66f16dc0aed97679e75 WatchSource:0}: Error finding container f95e69aff1073d3fb182e521b1e4b5b7574167d38432e66f16dc0aed97679e75: Status 404 returned error can't find the container with id 
f95e69aff1073d3fb182e521b1e4b5b7574167d38432e66f16dc0aed97679e75 Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.211771 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndf4\" (UniqueName: \"kubernetes.io/projected/7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94-kube-api-access-tndf4\") pod \"machine-config-daemon-m4g6v\" (UID: \"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\") " pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.216878 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.226883 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.244280 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.252195 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.279402 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.279511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.279558 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.279591 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 
14:06:46.279703 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279734 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279780 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279773 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279800 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279958 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:47.279922289 +0000 UTC m=+20.028892675 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.280033 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:47.280006892 +0000 UTC m=+20.028977288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.279746 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.280064 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.280090 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-12-03 14:06:47.280082514 +0000 UTC m=+20.029052960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.280663 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.280743 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:47.280728153 +0000 UTC m=+20.029698569 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.283594 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.312592 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.316591 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.321242 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kvvjx" Dec 03 14:06:46 crc kubenswrapper[5004]: W1203 14:06:46.335268 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6cf6ea_c7f7_44bb_b1fa_9e8d5f1d9c94.slice/crio-9e6a1b4a22bf05a07bb1da8aacaac249cf1cbc8435b3901af06c50f439615b8e WatchSource:0}: Error finding container 9e6a1b4a22bf05a07bb1da8aacaac249cf1cbc8435b3901af06c50f439615b8e: Status 404 returned error can't find the container with id 9e6a1b4a22bf05a07bb1da8aacaac249cf1cbc8435b3901af06c50f439615b8e Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.352156 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.371161 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.372529 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.396532 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.418543 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.430473 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.449326 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.464992 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.482202 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.510227 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.523362 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.615344 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.615568 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.647838 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.726267 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z2zlx" event={"ID":"3f436781-8ddc-4947-a631-d020ec46f63a","Type":"ContainerStarted","Data":"d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.726317 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z2zlx" event={"ID":"3f436781-8ddc-4947-a631-d020ec46f63a","Type":"ContainerStarted","Data":"f95e69aff1073d3fb182e521b1e4b5b7574167d38432e66f16dc0aed97679e75"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.727198 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"66150b598dd5aad44796aa71b292323cb113fc8fa6475294f27cfdcc14dca37f"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.728512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.728570 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.728584 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1519aa57c1c7755e74825a6578d8c604fe8fe7a982aa1beaa721b727304d8868"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.740404 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.746215 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvbnm"] Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.753251 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mjjss"] Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.754145 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.754560 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.755953 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.756450 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.758670 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.758719 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.758732 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"9e6a1b4a22bf05a07bb1da8aacaac249cf1cbc8435b3901af06c50f439615b8e"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.759193 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.759344 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 14:06:46 crc kubenswrapper[5004]: 
I1203 14:06:46.759465 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.759468 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.759818 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.760054 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.760101 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.760207 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.760210 5004 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-multus/multus-s6kp7"] Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.760679 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.764502 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.764552 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"67f99448db1659b4f0de8a65e64a7785e8369a03a02a633fd37a6c8444e7a49e"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.764925 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.765036 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.765116 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.765292 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.765314 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.765370 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 14:06:46 
crc kubenswrapper[5004]: I1203 14:06:46.768013 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kvvjx" event={"ID":"5acc6204-5c3a-4d00-9d86-13415fb3f68f","Type":"ContainerStarted","Data":"93e5de55536fdb0bcb3378be8f3cace7756b82bdb5cb287613d35ed1916a4632"} Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.774053 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785335 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785389 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785419 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785444 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785464 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785500 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-system-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785532 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785552 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-cnibin\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785570 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-hostroot\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785586 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785603 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-k8s-cni-cncf-io\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785584 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785622 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-netns\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785660 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-multus\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785699 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-os-release\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785728 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-kubelet\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785747 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-multus-certs\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.785781 5004 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.785785 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns\") pod \"ovnkube-node-kvbnm\" (UID: 
\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786155 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzrc\" (UniqueName: \"kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-system-cni-dir\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786236 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-daemon-config\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786256 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-etc-kubernetes\") pod 
\"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786274 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786292 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cnibin\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786312 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786333 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786353 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786406 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786428 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-os-release\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786528 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786571 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-binary-copy\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786594 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cni-binary-copy\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786621 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-socket-dir-parent\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: E1203 14:06:46.786724 5004 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" 
pod="openshift-etcd/etcd-crc" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786841 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-bin\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786909 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqsn\" (UniqueName: \"kubernetes.io/projected/180b8370-535b-42de-9d0a-cf5e572c9480-kube-api-access-qlqsn\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786960 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp5n4\" (UniqueName: \"kubernetes.io/projected/ff08cd56-3e47-4cd7-98ad-8571f178dc62-kube-api-access-lp5n4\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.786983 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash\") pod \"ovnkube-node-kvbnm\" (UID: 
\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.787004 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-conf-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.796843 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1988
8cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.802448 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-03 14:01:45 +0000 UTC, rotation deadline is 2026-08-27 02:17:24.684076792 +0000 UTC Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.802495 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6396h10m37.881584266s for next certificate rotation Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.810046 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.820298 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.829666 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.845043 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.856583 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.866800 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888157 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 
03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888193 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-conf-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888209 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp5n4\" (UniqueName: \"kubernetes.io/projected/ff08cd56-3e47-4cd7-98ad-8571f178dc62-kube-api-access-lp5n4\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888251 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 
14:06:46.888283 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888288 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888335 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888298 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888368 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-conf-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888395 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888433 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-system-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888451 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-cnibin\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888468 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-hostroot\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888504 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd\") pod \"ovnkube-node-kvbnm\" (UID: 
\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-k8s-cni-cncf-io\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888560 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-netns\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888485 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888631 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888632 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-multus\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888655 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888579 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-multus\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888823 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-os-release\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888849 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-kubelet\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888886 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-multus-certs\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888921 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888942 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzrc\" (UniqueName: 
\"kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888962 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-system-cni-dir\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.888992 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889018 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-daemon-config\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889099 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889154 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889233 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889337 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-cnibin\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889358 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-hostroot\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889371 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-k8s-cni-cncf-io\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889399 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin\") pod \"ovnkube-node-kvbnm\" (UID: 
\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889400 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-netns\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-system-cni-dir\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889454 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc 
kubenswrapper[5004]: I1203 14:06:46.889472 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-run-multus-certs\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889529 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-system-cni-dir\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889528 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-kubelet\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-etc-kubernetes\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889607 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889626 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889661 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cnibin\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889674 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-os-release\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889681 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889682 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889718 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889711 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889738 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-etc-kubernetes\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889730 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889874 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mjjss\" 
(UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.889972 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cnibin\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890015 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-os-release\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890086 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890100 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-os-release\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890112 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890145 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890167 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890220 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890225 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-binary-copy\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-daemon-config\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890243 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890251 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cni-binary-copy\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890500 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-socket-dir-parent\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890723 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/180b8370-535b-42de-9d0a-cf5e572c9480-cni-binary-copy\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891250 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff08cd56-3e47-4cd7-98ad-8571f178dc62-cni-binary-copy\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.890313 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-multus-socket-dir-parent\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891328 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-bin\") pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891352 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891409 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff08cd56-3e47-4cd7-98ad-8571f178dc62-host-var-lib-cni-bin\") 
pod \"multus-s6kp7\" (UID: \"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891445 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqsn\" (UniqueName: \"kubernetes.io/projected/180b8370-535b-42de-9d0a-cf5e572c9480-kube-api-access-qlqsn\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.891513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.892958 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/180b8370-535b-42de-9d0a-cf5e572c9480-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.894164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.907680 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp5n4\" (UniqueName: \"kubernetes.io/projected/ff08cd56-3e47-4cd7-98ad-8571f178dc62-kube-api-access-lp5n4\") pod \"multus-s6kp7\" (UID: 
\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\") " pod="openshift-multus/multus-s6kp7" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.919897 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzrc\" (UniqueName: \"kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc\") pod \"ovnkube-node-kvbnm\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.962067 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqsn\" (UniqueName: \"kubernetes.io/projected/180b8370-535b-42de-9d0a-cf5e572c9480-kube-api-access-qlqsn\") pod \"multus-additional-cni-plugins-mjjss\" (UID: \"180b8370-535b-42de-9d0a-cf5e572c9480\") " pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.962679 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:46 crc kubenswrapper[5004]: I1203 14:06:46.992287 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:46Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.035882 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.074200 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mjjss" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.082721 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.085325 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.092090 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s6kp7" Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.092278 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod180b8370_535b_42de_9d0a_cf5e572c9480.slice/crio-bcbaa18e588c4752219e2659da949b58aa7d48e9e6036efd333734959586fcea WatchSource:0}: Error finding container bcbaa18e588c4752219e2659da949b58aa7d48e9e6036efd333734959586fcea: Status 404 returned error can't find the container with id bcbaa18e588c4752219e2659da949b58aa7d48e9e6036efd333734959586fcea Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.098088 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78eea523_e8ee_4f41_93b2_6bbfdcdf3371.slice/crio-c505d925a7e26ca1511514d826505389a5135fea6cf726e7d4e2d4795c21255d WatchSource:0}: Error finding container c505d925a7e26ca1511514d826505389a5135fea6cf726e7d4e2d4795c21255d: Status 404 returned error can't find the container with id c505d925a7e26ca1511514d826505389a5135fea6cf726e7d4e2d4795c21255d Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.138875 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.161485 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.194789 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:47 crc 
kubenswrapper[5004]: E1203 14:06:47.194900 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:06:49.194884428 +0000 UTC m=+21.943854664 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.196945 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.236719 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.273711 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.295658 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.295699 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.295728 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.295755 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295823 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295896 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295912 5004 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295905 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:49.295883868 +0000 UTC m=+22.044854114 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295933 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295950 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295951 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:49.29593977 +0000 UTC m=+22.044910006 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295952 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.296005 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.296025 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.295983 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:49.295972271 +0000 UTC m=+22.044942507 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.296135 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:49.296111375 +0000 UTC m=+22.045081611 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.317844 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.360754 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.407803 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.434362 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.473722 5004 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474051 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474204 5004 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474458 5004 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.474524 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-m4g6v/status\": read tcp 38.102.83.38:57058->38.102.83.38:6443: use of closed network connection" Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474740 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474803 5004 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received 
Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474843 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474877 5004 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474898 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.474928 5004 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.475014 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.475050 5004 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc 
kubenswrapper[5004]: W1203 14:06:47.474051 5004 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.475087 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: W1203 14:06:47.475370 5004 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.521362 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.560040 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.594101 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.612924 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.612979 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.613110 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:47 crc kubenswrapper[5004]: E1203 14:06:47.616336 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.619655 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.620473 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.621741 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.622552 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.623815 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.624458 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.625167 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.626425 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.627546 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.628748 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.629378 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.630505 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.631085 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.631596 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.632461 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.633001 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.633968 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.634549 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.635094 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.636149 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.636681 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.637827 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.638359 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.638754 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.639146 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.640239 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.641071 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.642245 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.642810 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.643993 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.644799 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.645391 5004 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.645584 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.647025 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.647595 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.648091 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.649280 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.650033 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.650597 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.651338 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.654530 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.655129 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.656071 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.657137 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.657886 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.658910 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.659575 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.661526 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.662681 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.663730 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.664289 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.664846 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.666066 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.666769 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.667875 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.674250 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.714128 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.759000 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.772657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerStarted","Data":"38b3decee844f138f95ae4b0c950b7f3d7a481ce3a21fa5196e9be309f4f8e71"} Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.774337 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerStarted","Data":"bcbaa18e588c4752219e2659da949b58aa7d48e9e6036efd333734959586fcea"} Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.776478 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kvvjx" event={"ID":"5acc6204-5c3a-4d00-9d86-13415fb3f68f","Type":"ContainerStarted","Data":"e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0"} Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.778520 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"c505d925a7e26ca1511514d826505389a5135fea6cf726e7d4e2d4795c21255d"} Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.806385 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.852650 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.882404 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.941662 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.970256 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:47 crc kubenswrapper[5004]: I1203 14:06:47.991601 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.036485 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.047977 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.056319 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.056363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.056373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.056506 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.099582 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.105292 5004 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.105638 5004 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.106883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.106921 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.106933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.106949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.106962 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.124426 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.128453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.128492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.128503 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.128520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.128532 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.140365 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.145049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.145112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.145126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.145150 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.145162 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.151998 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.158094 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.163365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.163446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.163462 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.163483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.163496 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.179843 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.184205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.184239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.184250 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.184266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.184278 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.193063 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.197976 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.198094 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.199880 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.199905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.199913 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.199926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.199935 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.245364 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.284796 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.302392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.302431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.302442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.302461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.302474 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.405171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.405228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.405239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.405255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.405265 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.493162 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.507890 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.507943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.507958 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.507977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.507992 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.513024 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.568904 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.580036 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.610499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.610766 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.610835 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.610928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.610987 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.612740 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:48 crc kubenswrapper[5004]: E1203 14:06:48.612965 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.675812 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.712823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.712877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.712891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.712906 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.712918 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.734735 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.745420 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.776460 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.785786 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerStarted","Data":"76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.791339 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e" exitCode=0 Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.791485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.794687 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" exitCode=0 Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.794725 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" 
event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.807415 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5
d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.819247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.819547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.819560 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.819577 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc 
kubenswrapper[5004]: I1203 14:06:48.819588 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.834558 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.849671 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.867470 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.882637 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.895079 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.902632 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.911839 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.924367 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.924408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.924417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.924439 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.924452 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:48Z","lastTransitionTime":"2025-12-03T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.925616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.941060 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.963211 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.979192 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:48 crc kubenswrapper[5004]: I1203 14:06:48.993931 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.003475 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.014024 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.015468 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.027039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.027096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.027110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.027137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.027154 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.034961 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.073399 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.083616 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.104764 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.129780 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.129816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.129826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.129843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc 
kubenswrapper[5004]: I1203 14:06:49.129873 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.161803 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.200285 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.217328 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.217537 5004 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:06:53.217504794 +0000 UTC m=+25.966475020 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.232127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.232167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.232178 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.232195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.232206 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.234981 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.271646 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.312256 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03
T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.318637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.318684 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.318710 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.318731 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318843 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318875 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318882 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318887 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318929 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318908 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318979 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:53.318955777 +0000 UTC m=+26.067926093 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318990 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.319044 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:53.319026939 +0000 UTC m=+26.067997285 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.318886 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.319065 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:53.31905398 +0000 UTC m=+26.068024216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.319097 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:06:53.319083971 +0000 UTC m=+26.068054207 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.334552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.334603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.334617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.334639 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.334650 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.353098 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.391906 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.434660 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.437403 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.437453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.437464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.437483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.437495 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.478360 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dc
d8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.513781 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.540171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc 
kubenswrapper[5004]: I1203 14:06:49.540228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.540242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.540264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.540278 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.554398 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.594568 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.612993 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.613119 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.613284 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:49 crc kubenswrapper[5004]: E1203 14:06:49.613556 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.634241 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.643373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.643424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.643438 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.643456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.643467 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.700135 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.731537 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.749229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.749273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.749284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.749300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.749312 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.802738 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.802795 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.802807 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.802820 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.802831 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.805183 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1" exitCode=0 Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.806099 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.828754 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.843126 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.851781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.851816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.851826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.851840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.851852 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.863616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.889165 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.914421 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.954797 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.954989 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.955009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.955019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.955037 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.955049 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:49Z","lastTransitionTime":"2025-12-03T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:49 crc kubenswrapper[5004]: I1203 14:06:49.997746 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:49Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.034394 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.057180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.057218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 
14:06:50.057230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.057246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.057258 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.071941 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec
38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.113576 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.153588 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.159269 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.159590 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.159669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.159765 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.159846 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.194501 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.232285 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.262043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.262290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.262448 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.262559 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.262647 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.275448 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.315261 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.365181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.365454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.365555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.365663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.365750 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.468709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.468752 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.468765 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.468783 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.468796 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.570966 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.571008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.571019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.571037 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.571051 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.611847 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:50 crc kubenswrapper[5004]: E1203 14:06:50.612002 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.673821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.673957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.673977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.674004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.674021 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.778030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.778083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.778094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.778113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.778126 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.812708 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.815212 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb" exitCode=0 Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.815300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.817306 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.832839 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.847434 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.861304 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.875943 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.880136 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.880172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.880183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.880202 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.880215 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.889999 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.903299 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.916243 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.933007 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 
14:06:50.948965 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.965088 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.979170 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.983360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.983393 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.983402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.983417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.983426 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:50Z","lastTransitionTime":"2025-12-03T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:50 crc kubenswrapper[5004]: I1203 14:06:50.997371 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.019787 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.032998 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.052081 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.066540 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.080345 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.085190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.085231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.085245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.085264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.085277 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.092979 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.108316 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.122033 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.158574 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.187972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.188034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.188052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 
14:06:51.188073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.188085 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.197045 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.234961 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.277675 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 
14:06:51.290981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.291069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.291084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.291110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.291126 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.324479 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.355059 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.393567 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.393976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.394006 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.394016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.394031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.394040 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.442424 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.474084 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 
14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.496899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.496944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.496955 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.496976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.496987 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.515932 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.600177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.600233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.600246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.600266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.600281 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.612109 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.612187 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:51 crc kubenswrapper[5004]: E1203 14:06:51.612416 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:51 crc kubenswrapper[5004]: E1203 14:06:51.612236 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.702502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.702596 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.702610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.702637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.702652 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.804762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.804819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.804829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.804845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.804886 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.823283 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695" exitCode=0 Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.823398 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.839275 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.857157 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.872685 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.892578 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.908149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.908208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.908221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.908241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.908254 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:51Z","lastTransitionTime":"2025-12-03T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.914722 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.929688 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.939753 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.962643 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.984015 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:51 crc kubenswrapper[5004]: I1203 14:06:51.996090 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.009325 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.011745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.011792 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.011806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.011824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.011835 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.022724 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.035175 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.072283 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.114245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.114286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.114296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.114311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.114320 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.116588 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z 
is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.217106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.217363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.217373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.217391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.217403 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.319733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.319768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.319779 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.319797 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.319809 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.423274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.423336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.423348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.423382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.423396 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.525693 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.525750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.525763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.525782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.525803 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.613120 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:52 crc kubenswrapper[5004]: E1203 14:06:52.613397 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.628830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.628896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.628908 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.628929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.628943 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.731714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.731844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.731881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.731902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.731914 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.829928 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.832430 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8" exitCode=0 Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.832466 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.834039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.834063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.834072 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.834084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.834094 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.851925 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.864961 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.874365 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.888839 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.902211 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.914748 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.925768 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937086 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937122 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937160 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:52Z","lastTransitionTime":"2025-12-03T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.937480 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.955801 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:52 crc kubenswrapper[5004]: I1203 14:06:52.991169 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:52Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.005739 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.015652 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.031293 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.039628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.039763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.039843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.039943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.040002 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.044824 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.059116 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:53Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.141743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.141795 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.141809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.141830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.141844 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.245382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.245420 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.245431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.245450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.245468 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.262914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.263123 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:07:01.263108323 +0000 UTC m=+34.012078549 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.348196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.348240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.348250 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.348265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.348275 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.364228 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.364294 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.364320 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.364356 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364490 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364533 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364551 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:01.364534386 +0000 UTC m=+34.113504622 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364733 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:01.364699591 +0000 UTC m=+34.113670017 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364943 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364970 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.364991 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.365047 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:01.365031931 +0000 UTC m=+34.114002367 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.365297 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.365418 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.365506 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.365662 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:01.365640398 +0000 UTC m=+34.114610634 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.450084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.450116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.450124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.450137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.450148 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.551814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.552060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.552171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.552241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.552319 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.612614 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.612725 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.612750 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:53 crc kubenswrapper[5004]: E1203 14:06:53.612826 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.654914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.654953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.654962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.654976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.654984 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.758318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.758391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.758405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.758430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.758444 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.861780 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.861811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.861820 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.861834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.861843 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.965485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.965557 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.965579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.965614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:53 crc kubenswrapper[5004]: I1203 14:06:53.965633 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:53Z","lastTransitionTime":"2025-12-03T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.068693 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.068736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.068745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.068762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.068772 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.171367 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.171430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.171443 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.171482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.171498 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.274494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.274542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.274555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.274575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.274587 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.376747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.376796 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.376806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.376827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.376838 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.479975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.480033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.480043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.480063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.480076 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.582791 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.582834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.582847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.582884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.582895 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.612328 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:54 crc kubenswrapper[5004]: E1203 14:06:54.612451 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.684816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.684851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.684873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.684887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.684898 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.787750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.787786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.787795 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.787810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.787828 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.844239 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerStarted","Data":"e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.890247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.890301 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.890314 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.890332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.890344 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.993197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.993265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.993276 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.993300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:54 crc kubenswrapper[5004]: I1203 14:06:54.993314 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:54Z","lastTransitionTime":"2025-12-03T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.096530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.096571 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.096580 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.096595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.096603 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.199501 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.199560 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.199578 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.199602 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.199628 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.302822 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.302919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.302934 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.302957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.302972 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.406734 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.406787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.406797 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.406819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.406833 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.510271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.510570 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.510579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.510594 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.510605 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612343 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612426 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:55 crc kubenswrapper[5004]: E1203 14:06:55.612451 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:55 crc kubenswrapper[5004]: E1203 14:06:55.612562 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612636 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.612687 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.714621 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.714652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.714660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.714672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.714681 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.817291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.817326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.817338 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.817357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.817368 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.853352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.854526 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.854558 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.854606 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.859746 5004 generic.go:334] "Generic (PLEG): container finished" podID="180b8370-535b-42de-9d0a-cf5e572c9480" containerID="e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f" exitCode=0 Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.859798 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerDied","Data":"e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.868120 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.877951 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 
14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.878075 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.881181 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.893486 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.905310 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.923364 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0
b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.925961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.926021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.926032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.926049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.926061 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:55Z","lastTransitionTime":"2025-12-03T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.937549 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.955340 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.971345 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:55 crc kubenswrapper[5004]: I1203 14:06:55.987129 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:55Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.005554 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.018757 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.028679 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.028701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.028712 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.028727 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.028736 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.031390 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71
111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.052393 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.065154 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:
46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.077574 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.097165 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.120792 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.130946 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.130992 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 
14:06:56.131005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.131022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.131034 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.133000 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.143634 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.154019 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03
T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.165451 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.174080 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.185047 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.194989 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.204730 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.213021 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.226035 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.233279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.233307 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.233316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.233331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.233342 5004 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.239420 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.249932 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.260968 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:56Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.335590 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.335628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.335638 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.335653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.335664 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.437468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.437506 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.437515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.437531 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.437541 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.539703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.539741 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.539750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.539763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.539772 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.612934 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:56 crc kubenswrapper[5004]: E1203 14:06:56.613063 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.641972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.642035 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.642054 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.642078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.642097 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.744988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.745533 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.745550 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.745569 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.745580 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.847339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.847380 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.847401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.847417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.847430 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.870639 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" event={"ID":"180b8370-535b-42de-9d0a-cf5e572c9480","Type":"ContainerStarted","Data":"7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67"} Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.950016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.950710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.950748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.950768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:56 crc kubenswrapper[5004]: I1203 14:06:56.950781 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:56Z","lastTransitionTime":"2025-12-03T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.052896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.052961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.052972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.052994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.053008 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.155601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.155683 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.155716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.155749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.155773 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.257762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.257803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.257815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.257832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.257844 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.360606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.360646 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.360658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.360675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.360687 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.463042 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.463087 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.463098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.463113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.463125 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.565243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.565280 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.565288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.565303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.565313 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.612263 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.612342 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:57 crc kubenswrapper[5004]: E1203 14:06:57.612442 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:57 crc kubenswrapper[5004]: E1203 14:06:57.612497 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.639934 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.661988 5004 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.670036 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.670094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.670111 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.670134 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.670155 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.677616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.690021 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.699619 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.709630 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.721013 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.733832 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.745968 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.756209 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.769086 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.772611 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.772647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.772660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.772674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.772684 5004 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.794047 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a
82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.811891 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.824433 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.842750 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.874612 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.874647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.874658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.874673 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.874684 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.893809 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.903505 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.917830 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.929547 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.940086 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.947826 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.962611 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.977461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.977501 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.977511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.977528 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.977539 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:57Z","lastTransitionTime":"2025-12-03T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.980944 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:57 crc kubenswrapper[5004]: I1203 14:06:57.999167 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.010930 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.028660 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.038959 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.049962 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.064039 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.074774 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.079138 5004 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.079167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.079177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.079192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.079201 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.215225 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.215254 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.215261 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.215274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.215282 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.317769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.317818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.317832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.317911 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.317925 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.329753 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.329793 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.329801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.329816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.329825 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.341393 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.345214 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.345258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.345300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.345316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.345325 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.356247 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.359847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.359913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.359926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.359944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.359955 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.371625 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.375091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.375129 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.375141 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.375158 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.375170 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.386455 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.389170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.389204 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.389221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.389237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.389249 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.400617 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:58Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.400727 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.419762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.419807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.419819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.419837 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.419849 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.522224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.522267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.522276 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.522292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.522301 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.612707 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:06:58 crc kubenswrapper[5004]: E1203 14:06:58.612832 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.624027 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.624061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.624079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.624096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.624108 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.727105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.727185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.727201 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.727217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.727228 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.829708 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.829749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.829758 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.829774 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.829784 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.932287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.932343 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.932358 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.932379 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.932395 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:58Z","lastTransitionTime":"2025-12-03T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.988971 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm"] Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.989362 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.991554 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 14:06:58 crc kubenswrapper[5004]: I1203 14:06:58.991561 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.002830 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.013256 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.022470 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.025805 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89170a26-1e46-43b1-a994-94a9879d3cf6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.025848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.025890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xw9\" (UniqueName: \"kubernetes.io/projected/89170a26-1e46-43b1-a994-94a9879d3cf6-kube-api-access-g9xw9\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.025925 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.033743 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.034740 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.034803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.034822 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.034848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.034891 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.048597 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.058941 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.071359 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.083066 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.094362 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.106488 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.121826 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.126621 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89170a26-1e46-43b1-a994-94a9879d3cf6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.126881 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.126991 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xw9\" (UniqueName: \"kubernetes.io/projected/89170a26-1e46-43b1-a994-94a9879d3cf6-kube-api-access-g9xw9\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.127144 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.127741 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.127909 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89170a26-1e46-43b1-a994-94a9879d3cf6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.131624 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89170a26-1e46-43b1-a994-94a9879d3cf6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.137252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.137281 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.137291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.137306 5004 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.137316 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.138670 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.146474 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9xw9\" (UniqueName: \"kubernetes.io/projected/89170a26-1e46-43b1-a994-94a9879d3cf6-kube-api-access-g9xw9\") pod \"ovnkube-control-plane-749d76644c-ff6bm\" (UID: \"89170a26-1e46-43b1-a994-94a9879d3cf6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.150923 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.160708 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.178617 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.200342 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.239697 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.239722 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.239730 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.239743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.239751 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.302510 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" Dec 03 14:06:59 crc kubenswrapper[5004]: W1203 14:06:59.316091 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89170a26_1e46_43b1_a994_94a9879d3cf6.slice/crio-b78b66d24572b30bab9345bcffbea9f87234113a1e99857e466bed0fe26bb1ef WatchSource:0}: Error finding container b78b66d24572b30bab9345bcffbea9f87234113a1e99857e466bed0fe26bb1ef: Status 404 returned error can't find the container with id b78b66d24572b30bab9345bcffbea9f87234113a1e99857e466bed0fe26bb1ef Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.342494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.342543 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.342555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.342573 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 
14:06:59.342586 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.445539 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.445579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.445589 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.445614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.445625 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.548051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.548082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.548091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.548107 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.548117 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.612398 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:06:59 crc kubenswrapper[5004]: E1203 14:06:59.612652 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.613227 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:06:59 crc kubenswrapper[5004]: E1203 14:06:59.613409 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.650852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.650927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.650944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.650963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.650977 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.754333 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.754369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.754378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.754393 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.754401 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.832490 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.845077 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.856481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.856520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.856531 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.856547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.856558 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.857798 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.878213 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.882150 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" event={"ID":"89170a26-1e46-43b1-a994-94a9879d3cf6","Type":"ContainerStarted","Data":"e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.882190 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" event={"ID":"89170a26-1e46-43b1-a994-94a9879d3cf6","Type":"ContainerStarted","Data":"b78b66d24572b30bab9345bcffbea9f87234113a1e99857e466bed0fe26bb1ef"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.891087 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.906210 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.919940 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.930796 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.940960 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.956676 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.958887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.958925 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.958939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.958952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.958962 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:06:59Z","lastTransitionTime":"2025-12-03T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:06:59 crc kubenswrapper[5004]: I1203 14:06:59.974035 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:06:59Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.011626 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.032021 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.061042 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.061076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.061085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.061100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.061112 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.062947 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.082426 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.094516 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.104317 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.163384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.163419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.163431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.163449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.163461 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.266559 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.266656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.266667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.266682 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.266692 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.369750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.369800 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.369812 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.369830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.369843 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.440044 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dgzr8"] Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.440968 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: E1203 14:07:00.441087 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.456615 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.468884 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.472851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.472914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.472935 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.472952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.472963 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.482211 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.492127 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.500667 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.512988 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.527737 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.538298 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.538487 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgt8\" (UniqueName: \"kubernetes.io/projected/54394065-8262-4c2e-abdb-c81b096049ef-kube-api-access-lmgt8\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.542395 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb
99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.557668 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.575990 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.576099 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.576394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.576478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.576565 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.576923 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.589397 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc 
kubenswrapper[5004]: I1203 14:07:00.606414 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.612650 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:00 crc kubenswrapper[5004]: E1203 14:07:00.613228 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.627286 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.638774 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.639255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.639292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgt8\" (UniqueName: \"kubernetes.io/projected/54394065-8262-4c2e-abdb-c81b096049ef-kube-api-access-lmgt8\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: E1203 14:07:00.639703 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:00 crc kubenswrapper[5004]: E1203 14:07:00.639942 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:01.139918484 +0000 UTC m=+33.888888720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.657956 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.664869 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgt8\" (UniqueName: \"kubernetes.io/projected/54394065-8262-4c2e-abdb-c81b096049ef-kube-api-access-lmgt8\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.678094 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.680606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.680764 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.680823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.680939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.681041 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.692239 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.783849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.783913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.783926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.783943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.783958 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.886754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.886799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.886810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.886826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.886840 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.891970 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/0.log" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.896051 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7" exitCode=1 Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.896088 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.896748 5004 scope.go:117] "RemoveContainer" containerID="098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.910415 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.926047 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.939795 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.957679 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.977029 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.989100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:00 crc 
kubenswrapper[5004]: I1203 14:07:00.989134 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.989145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.989160 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.989171 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:00Z","lastTransitionTime":"2025-12-03T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:00 crc kubenswrapper[5004]: I1203 14:07:00.995429 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:00Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.018078 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970
f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.032669 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.045198 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.068210 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:06:59.529596 6267 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 14:06:59.529635 6267 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI1203 14:06:59.529640 6267 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 14:06:59.529651 6267 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 14:06:59.529661 6267 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 14:06:59.529713 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 14:06:59.529722 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:06:59.529728 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 14:06:59.529733 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 14:06:59.529770 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 14:06:59.529774 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:06:59.529843 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:06:59.529869 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:06:59.529885 6267 factory.go:656] Stopping watch factory\\\\nI1203 14:06:59.529896 6267 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd440
05af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.083724 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.092015 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.092090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.092102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.092117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.092127 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.096347 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.107148 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.121703 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.133006 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.143169 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.143713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.143829 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.143899 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:02.143883216 +0000 UTC m=+34.892853452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.156805 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.194361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.194407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.194425 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.194443 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.194454 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.297132 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.297161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.297169 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.297184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.297193 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.346488 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.346640 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:07:17.346614958 +0000 UTC m=+50.095585194 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.399892 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.399938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.399950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.399968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.399979 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.447952 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.448012 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.448042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.448060 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448168 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448182 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448192 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448251 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:17.448234887 +0000 UTC m=+50.197205123 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448357 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448465 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 14:07:17.448445423 +0000 UTC m=+50.197415709 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448524 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448567 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:17.448557446 +0000 UTC m=+50.197527742 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448569 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448588 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448601 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.448637 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:17.448624558 +0000 UTC m=+50.197594904 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.501976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.502020 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.502036 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.502054 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.502065 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.604773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.604840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.604874 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.605662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.605733 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.615680 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.615811 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.616146 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:01 crc kubenswrapper[5004]: E1203 14:07:01.616199 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.707455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.707487 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.707497 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.707515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.707525 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.810369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.810417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.810428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.810445 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.810458 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.901033 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/0.log" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.903538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.903944 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.905056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" event={"ID":"89170a26-1e46-43b1-a994-94a9879d3cf6","Type":"ContainerStarted","Data":"59f6f8aeabbb619e93eaa88a4799e5d267ef2d59869952cbb1832602c24ff214"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.915963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.916179 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.916266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.916328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.916390 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:01Z","lastTransitionTime":"2025-12-03T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.921898 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.932795 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.944824 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.957993 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.969497 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.982484 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:01 crc kubenswrapper[5004]: I1203 14:07:01.997215 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:01Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.010451 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.018504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.018541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.018551 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.018566 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.018577 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.022635 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.036145 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.047356 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.060368 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.070571 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc 
kubenswrapper[5004]: I1203 14:07:02.090064 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.102219 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.111893 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.120320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.120368 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.120381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.120399 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.120411 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.131480 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:06:59.529596 6267 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 14:06:59.529635 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 14:06:59.529640 6267 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 14:06:59.529651 6267 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 14:06:59.529661 6267 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 14:06:59.529713 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 14:06:59.529722 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:06:59.529728 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 14:06:59.529733 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 14:06:59.529770 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 14:06:59.529774 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:06:59.529843 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:06:59.529869 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:06:59.529885 6267 factory.go:656] Stopping watch factory\\\\nI1203 14:06:59.529896 6267 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.143155 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.153523 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.154734 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:02 crc kubenswrapper[5004]: E1203 14:07:02.154917 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:02 crc kubenswrapper[5004]: E1203 14:07:02.154991 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:04.154973202 +0000 UTC m=+36.903943438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.162807 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.174071 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.185123 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc 
kubenswrapper[5004]: I1203 14:07:02.198383 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.211307 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.223432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.223478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.223493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 
14:07:02.223513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.223522 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.226418 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.237002 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.250152 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.267431 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970
f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.279829 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.293516 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.312892 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:06:59.529596 6267 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 14:06:59.529635 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 14:06:59.529640 6267 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 14:06:59.529651 6267 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 14:06:59.529661 6267 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 14:06:59.529713 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 14:06:59.529722 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:06:59.529728 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 14:06:59.529733 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 14:06:59.529770 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 14:06:59.529774 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:06:59.529843 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:06:59.529869 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:06:59.529885 6267 factory.go:656] Stopping watch factory\\\\nI1203 14:06:59.529896 6267 ovnkube.go:599] Stopped 
ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325200 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325283 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.325963 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.337479 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.348817 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.427702 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.427743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.427751 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.427769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.427778 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.530001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.530061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.530075 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.530093 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.530115 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.612163 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.612162 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:02 crc kubenswrapper[5004]: E1203 14:07:02.612306 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:02 crc kubenswrapper[5004]: E1203 14:07:02.612604 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.632428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.632682 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.632810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.632903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.632965 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.735710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.735772 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.735787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.735807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.735822 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.838196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.838278 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.838291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.838315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.838331 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.909576 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/1.log" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.910325 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/0.log" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.913303 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0" exitCode=1 Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.913344 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.913416 5004 scope.go:117] "RemoveContainer" containerID="098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.914253 5004 scope.go:117] "RemoveContainer" containerID="d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0" Dec 03 14:07:02 crc kubenswrapper[5004]: E1203 14:07:02.914447 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.933389 5004 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e095247
3158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.940548 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.940597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.940608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.940629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.940641 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:02Z","lastTransitionTime":"2025-12-03T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.948441 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.963928 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.979950 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:02 crc kubenswrapper[5004]: I1203 14:07:02.998713 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:02Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.010770 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc 
kubenswrapper[5004]: I1203 14:07:03.029994 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.042458 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.043615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.043673 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.043691 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.043716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.043734 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.054378 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71
111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.073601 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://098ba4cb8eede1153695853f9b856ca20a57429ba3f7e78f12d0c9f22451bfc7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 14:06:59.529596 6267 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 14:06:59.529635 6267 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 14:06:59.529640 6267 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 14:06:59.529651 6267 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 14:06:59.529661 6267 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 14:06:59.529713 6267 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 14:06:59.529722 6267 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 14:06:59.529728 6267 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 14:06:59.529733 6267 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 14:06:59.529770 6267 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 14:06:59.529774 6267 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 14:06:59.529843 6267 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 14:06:59.529869 6267 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 14:06:59.529885 6267 factory.go:656] Stopping watch factory\\\\nI1203 14:06:59.529896 6267 ovnkube.go:599] Stopped ovnkube\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 
default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.085797 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.099743 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.111296 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.122909 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.137985 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.146213 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.146260 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.146271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.146288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.146302 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.150103 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.162240 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.248567 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.248624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.248637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.248655 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.248668 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.351288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.351369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.351403 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.351433 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.351454 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.453686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.453751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.453766 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.453781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.453793 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.556645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.556686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.556695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.556709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.556720 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.611999 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.612038 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:03 crc kubenswrapper[5004]: E1203 14:07:03.612184 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:03 crc kubenswrapper[5004]: E1203 14:07:03.612272 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.659115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.659447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.659513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.659591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.659665 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.762119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.762190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.762205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.762231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.762247 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.864312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.864337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.864345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.864359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.864385 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.918203 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/1.log" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.921708 5004 scope.go:117] "RemoveContainer" containerID="d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0" Dec 03 14:07:03 crc kubenswrapper[5004]: E1203 14:07:03.921909 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.939512 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.949892 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.958465 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.965989 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.966154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.966247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.966356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.966441 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:03Z","lastTransitionTime":"2025-12-03T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.975711 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:03 crc kubenswrapper[5004]: I1203 14:07:03.989489 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.000496 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:03Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.011368 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.024065 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.034687 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.043616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.055697 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.068896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.068940 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.068949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.068999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.069009 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.069762 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.082472 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.093487 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.104097 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.123196 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.135204 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:04Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:04 crc 
kubenswrapper[5004]: I1203 14:07:04.171607 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.171824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.171938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.172005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.172063 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.172497 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:04 crc kubenswrapper[5004]: E1203 14:07:04.172702 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:04 crc kubenswrapper[5004]: E1203 14:07:04.172801 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:08.172783716 +0000 UTC m=+40.921754032 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.275018 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.275076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.275099 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.275125 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.275143 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.380574 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.380619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.380630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.380646 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.380656 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.483062 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.483090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.483098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.483110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.483120 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.585523 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.585552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.585561 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.585575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.585584 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.612807 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:04 crc kubenswrapper[5004]: E1203 14:07:04.612944 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.613132 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:04 crc kubenswrapper[5004]: E1203 14:07:04.613309 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.688352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.688392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.688405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.688421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.688431 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.790173 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.790237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.790253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.790271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.790283 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.892126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.892168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.892177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.892192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.892203 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.994564 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.994606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.994618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.994634 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:04 crc kubenswrapper[5004]: I1203 14:07:04.994646 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:04Z","lastTransitionTime":"2025-12-03T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.098032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.098085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.098096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.098112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.098121 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.200918 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.200959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.200970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.200986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.201001 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.303306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.303346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.303356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.303370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.303382 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.406075 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.406118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.406129 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.406167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.406180 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.509086 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.509163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.509205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.509230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.509246 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612043 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: E1203 14:07:05.612189 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612241 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.612235 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: E1203 14:07:05.612443 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.715126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.715180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.715193 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.715210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.715221 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.817432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.817482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.817494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.817513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.817840 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.920765 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.920822 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.920841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.920897 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:05 crc kubenswrapper[5004]: I1203 14:07:05.920915 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:05Z","lastTransitionTime":"2025-12-03T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.022844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.022918 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.022938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.022957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.022968 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.125499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.125564 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.125575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.125601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.125610 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.227837 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.227891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.227902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.227920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.227931 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.329847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.329902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.329913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.329931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.329943 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.432076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.432598 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.433287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.433343 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.433362 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.536206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.536287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.536300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.536319 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.536331 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.611937 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:06 crc kubenswrapper[5004]: E1203 14:07:06.612076 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.612099 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:06 crc kubenswrapper[5004]: E1203 14:07:06.612240 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.639415 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.639455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.639466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.639482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.639497 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.741678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.741716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.741724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.741740 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.741751 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.844398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.844433 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.844445 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.844461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.844471 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.946537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.946588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.946602 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.946621 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:06 crc kubenswrapper[5004]: I1203 14:07:06.946632 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:06Z","lastTransitionTime":"2025-12-03T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.049302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.049377 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.049388 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.049430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.049444 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.151924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.152167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.152241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.152332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.152405 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.254515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.254562 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.254574 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.254611 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.254621 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.356905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.357133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.357199 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.357290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.357344 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.460272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.460300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.460308 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.460336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.460345 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.562499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.562557 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.562570 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.562588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.562601 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.612413 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.612436 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:07 crc kubenswrapper[5004]: E1203 14:07:07.612562 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:07 crc kubenswrapper[5004]: E1203 14:07:07.612672 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.628412 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412ab
d6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.639140 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc 
kubenswrapper[5004]: I1203 14:07:07.652853 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665126 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.665528 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.678472 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.691578 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.709930 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.722994 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.733759 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.759373 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.787006 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.798535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.798579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.798589 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.798606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.798619 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.813754 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.826748 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.840818 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.857368 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.869592 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.879404 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:07Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.901535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.901605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.901616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.901631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:07 crc kubenswrapper[5004]: I1203 14:07:07.901643 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:07Z","lastTransitionTime":"2025-12-03T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.004162 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.004478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.004556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.004648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.004717 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.107680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.107730 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.107743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.107781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.107793 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.210453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.210666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.210760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.210836 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.210911 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.216878 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.216982 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.217037 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:16.217021596 +0000 UTC m=+48.965991842 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.316470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.317301 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.317366 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.317428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.317483 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.419446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.419484 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.419493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.419508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.419519 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.521347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.521396 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.521408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.521427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.521440 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.552106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.552156 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.552171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.552188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.552199 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.566636 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:08Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.569671 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.569698 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.569707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.569721 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.569729 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.582628 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:08Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.586251 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.586290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.586300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.586316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.586326 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.598364 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:08Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.602200 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.602238 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.602247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.602266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.602284 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.612311 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.612392 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.612490 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.612605 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.614804 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload omitted: byte-for-byte identical to the 14:07:08.598364 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:08Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.617381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.617416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.617429 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.617446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.617465 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.628842 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:08Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:08 crc kubenswrapper[5004]: E1203 14:07:08.629275 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.631005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.631095 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.631155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.631217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.631290 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.733641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.733681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.733690 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.733705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.733713 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.835966 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.836002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.836010 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.836022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.836031 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.937907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.937961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.937970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.937983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:08 crc kubenswrapper[5004]: I1203 14:07:08.937992 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:08Z","lastTransitionTime":"2025-12-03T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.040286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.040334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.040346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.040363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.040376 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.142534 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.142573 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.142584 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.142599 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.142613 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.245432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.245672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.245755 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.245852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.245957 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.348630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.348671 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.348682 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.348700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.348710 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.450895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.451191 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.451311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.451419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.451503 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.553969 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.554185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.554266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.554359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.554433 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.612660 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:09 crc kubenswrapper[5004]: E1203 14:07:09.612785 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.612681 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:09 crc kubenswrapper[5004]: E1203 14:07:09.613245 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.656416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.656648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.656761 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.656996 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.657163 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.759613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.759649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.759661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.759678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.759690 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.862557 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.862901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.862986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.863064 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.863144 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.965941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.966016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.966028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.966049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:09 crc kubenswrapper[5004]: I1203 14:07:09.966059 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:09Z","lastTransitionTime":"2025-12-03T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.069870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.069899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.069907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.069920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.069929 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.172623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.172661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.172670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.172685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.172696 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.274889 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.274919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.274927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.274944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.274952 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.377073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.377115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.377128 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.377144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.377156 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.479656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.479695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.479706 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.479722 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.479733 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.582240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.582328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.582345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.582362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.582385 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.612209 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:10 crc kubenswrapper[5004]: E1203 14:07:10.612357 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.612227 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:10 crc kubenswrapper[5004]: E1203 14:07:10.612555 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.685535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.685575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.685586 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.685602 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.685614 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.788114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.788165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.788177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.788194 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.788205 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.890496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.890542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.890551 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.890568 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.890578 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.992429 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.992473 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.992487 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.992504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:10 crc kubenswrapper[5004]: I1203 14:07:10.992516 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:10Z","lastTransitionTime":"2025-12-03T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.094945 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.094983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.094994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.095013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.095024 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.198698 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.198742 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.198753 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.198802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.198817 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.302225 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.302283 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.302296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.302316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.302328 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.405163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.405220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.405230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.405253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.405267 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.508108 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.508457 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.508614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.508726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.508792 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611232 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611261 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611271 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.611966 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:11 crc kubenswrapper[5004]: E1203 14:07:11.612056 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.612216 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:11 crc kubenswrapper[5004]: E1203 14:07:11.612439 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.713357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.713663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.713745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.713815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.713910 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.816187 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.816223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.816233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.816246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.816257 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.918590 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.918640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.918649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.918665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:11 crc kubenswrapper[5004]: I1203 14:07:11.918674 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:11Z","lastTransitionTime":"2025-12-03T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.020450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.020709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.020798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.020928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.021044 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.123978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.124049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.124063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.124079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.124089 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.226427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.226727 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.226813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.226934 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.227029 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.329756 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.329786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.329799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.329814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.329824 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.432114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.432154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.432168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.432186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.432199 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.534840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.534915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.534931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.534954 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.534969 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.612630 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.612664 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:12 crc kubenswrapper[5004]: E1203 14:07:12.612752 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:12 crc kubenswrapper[5004]: E1203 14:07:12.612845 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.636843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.636929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.636950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.636978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.636993 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.739337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.739387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.739401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.739417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.739426 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.841868 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.841903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.841913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.841926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.841935 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.948496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.948582 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.948605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.948632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:12 crc kubenswrapper[5004]: I1203 14:07:12.948658 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:12Z","lastTransitionTime":"2025-12-03T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.051627 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.051675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.051687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.051704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.051716 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.154747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.154803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.154815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.154832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.154845 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.257144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.257185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.257196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.257212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.257223 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.359902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.360170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.360237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.360311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.360375 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.463147 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.463185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.463226 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.463243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.463260 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.565620 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.565929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.566011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.566110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.566205 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.612197 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.612202 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:13 crc kubenswrapper[5004]: E1203 14:07:13.612421 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:13 crc kubenswrapper[5004]: E1203 14:07:13.612519 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.668882 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.669159 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.669230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.669314 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.669385 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.771981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.772262 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.772337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.772403 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.772460 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.874666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.874729 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.874746 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.874773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.874793 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.976842 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.976925 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.976949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.976979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:13 crc kubenswrapper[5004]: I1203 14:07:13.977001 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:13Z","lastTransitionTime":"2025-12-03T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.079348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.079410 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.079426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.079449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.079467 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.182629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.182689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.182707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.182733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.182757 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.285824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.285908 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.285930 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.285957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.285980 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.388454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.388506 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.388523 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.388545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.388560 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.491976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.492024 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.492036 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.492055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.492068 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.593603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.593648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.593665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.593680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.593689 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.612313 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.612329 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:14 crc kubenswrapper[5004]: E1203 14:07:14.612628 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:14 crc kubenswrapper[5004]: E1203 14:07:14.612472 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.696577 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.696617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.696626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.696642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.696651 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.798284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.798317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.798325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.798339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.798347 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.901096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.901141 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.901151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.901168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:14 crc kubenswrapper[5004]: I1203 14:07:14.901180 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:14Z","lastTransitionTime":"2025-12-03T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.003539 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.003840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.003961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.004049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.004127 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.106941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.106993 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.107007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.107028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.107044 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.209684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.209969 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.210061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.210148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.210227 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.313951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.314003 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.314017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.314038 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.314047 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.416574 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.416612 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.416623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.416637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.416649 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.519402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.519465 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.519484 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.519547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.519565 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.612742 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:15 crc kubenswrapper[5004]: E1203 14:07:15.612997 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.613713 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:15 crc kubenswrapper[5004]: E1203 14:07:15.613982 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.615502 5004 scope.go:117] "RemoveContainer" containerID="d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.622430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.622464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.622476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.622493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.622509 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.724614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.724647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.724657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.724674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.724685 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.827236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.827273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.827283 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.827298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.827308 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.929363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.929400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.929411 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.929428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.929440 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:15Z","lastTransitionTime":"2025-12-03T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.964225 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/1.log" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.968417 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e"} Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.969055 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:07:15 crc kubenswrapper[5004]: I1203 14:07:15.989391 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 
14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:15Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.005183 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.022986 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.031188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.031221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.031232 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.031245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.031254 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.044275 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.061742 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.074508 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc 
kubenswrapper[5004]: I1203 14:07:16.093556 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.104564 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.113231 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.130309 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.133986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.134017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.134026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.134039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.134050 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.143970 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.155445 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.167410 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.183661 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.198728 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.213294 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.229054 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.236788 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.236831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.236841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.236872 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.236887 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.301435 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:16 crc kubenswrapper[5004]: E1203 14:07:16.301580 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:16 crc kubenswrapper[5004]: E1203 14:07:16.301635 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:07:32.301620744 +0000 UTC m=+65.050590980 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.338959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.339002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.339011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.339042 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.339052 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.441208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.441257 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.441265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.441279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.441288 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.544394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.544447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.544468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.544504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.544530 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.612295 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.612366 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:16 crc kubenswrapper[5004]: E1203 14:07:16.612449 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:16 crc kubenswrapper[5004]: E1203 14:07:16.613291 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.646617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.646658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.646669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.646686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.646696 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.709392 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.722337 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.727372 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.734770 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.746908 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.750146 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.750192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.750213 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.750233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.750246 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.759637 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.771322 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.782788 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.791638 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.805160 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.815458 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc 
kubenswrapper[5004]: I1203 14:07:16.828408 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.841142 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.853285 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.855525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.855566 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.855651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.855675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.855689 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.865085 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.885830 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.898891 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.909603 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.929062 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.958685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.958744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.958754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.958768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.958780 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:16Z","lastTransitionTime":"2025-12-03T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.974012 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/2.log" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.975081 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/1.log" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.978577 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" exitCode=1 Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.978649 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e"} Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.978742 5004 scope.go:117] "RemoveContainer" containerID="d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0" Dec 03 14:07:16 crc kubenswrapper[5004]: I1203 14:07:16.980508 5004 scope.go:117] "RemoveContainer" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" Dec 03 14:07:16 crc kubenswrapper[5004]: E1203 14:07:16.980954 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
20s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.000180 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.015306 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.031003 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.048326 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.060203 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.062057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.062106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.062118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.062138 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.062163 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.070755 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.080276 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.091145 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.104288 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:
45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962
c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.116346 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.128073 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.138175 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.151048 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.161230 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc 
kubenswrapper[5004]: I1203 14:07:17.166686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.166743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.166761 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.166782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.166801 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.183219 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.194406 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.203902 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.218878 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.270320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.270354 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.270361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.270375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.270394 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.373753 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.374051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.374063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.374091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.374104 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.411499 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.411680 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:07:49.411646911 +0000 UTC m=+82.160617187 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.476973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.477045 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.477066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 
14:07:17.477097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.477118 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.512627 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.512687 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.512715 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.512748 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.512916 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.512922 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.512981 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:49.51295988 +0000 UTC m=+82.261930126 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.512954 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513024 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513066 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:49.513053753 +0000 UTC m=+82.262024009 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.512926 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513090 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513102 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513159 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:49.513144206 +0000 UTC m=+82.262114442 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513665 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.513773 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:07:49.513747833 +0000 UTC m=+82.262718099 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.579943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.579982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.579993 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.580011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.580022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.612381 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.612560 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.612588 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.612780 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.623617 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.637629 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e
9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.650513 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syn
cer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.661476 5004 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.674463 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.682001 5004 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.682043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.682060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.682082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.682098 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.688626 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.699162 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc 
kubenswrapper[5004]: I1203 14:07:17.711057 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.721733 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.733757 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.749363 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d988ce69d01a2370208af8fe8e90ab1a677bbb56988ae048f7424cea5403c6a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:02Z\\\",\\\"message\\\":\\\"ift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1203 14:07:01.999183 6470 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999190 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf 
after 0 failed attempt(s)\\\\nI1203 14:07:01.999195 6470 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm\\\\nI1203 14:07:01.999198 6470 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 14:07:01.999208 6470 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm in node crc\\\\nI1203 14:07:01.999216 6470 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm after 0 failed attempt(s)\\\\nF1203 14:07:01.999223 6470 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.767174 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.779071 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.784415 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.784454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.784466 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.784485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.784497 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.788302 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71
111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.799178 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.809837 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.820281 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.830526 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:17Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.886349 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.886391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.886399 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.886413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.886423 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.982781 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/2.log" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.985769 5004 scope.go:117] "RemoveContainer" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" Dec 03 14:07:17 crc kubenswrapper[5004]: E1203 14:07:17.986088 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.988008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.988049 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.988061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.988077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:17 crc kubenswrapper[5004]: I1203 14:07:17.988089 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:17Z","lastTransitionTime":"2025-12-03T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.011190 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.023164 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.036945 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.054205 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.065422 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.077359 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090151 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090459 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.090470 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.101083 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.113707 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.126807 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.136395 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.148354 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.157275 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 
2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.168947 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.180062 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.191950 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.192189 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.192213 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.192223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.192239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.192250 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.202146 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.216354 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.295133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.295176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.295208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.295224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.295234 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.397724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.397762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.397771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.397790 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.397800 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.500231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.500289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.500297 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.500312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.500320 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.602214 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.602263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.602275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.602293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.602306 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.612811 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.612883 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:18 crc kubenswrapper[5004]: E1203 14:07:18.612959 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:18 crc kubenswrapper[5004]: E1203 14:07:18.613045 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.704494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.704532 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.704541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.704556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.704565 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.806949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.807015 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.807033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.807058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.807075 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.910103 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.910157 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.910170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.910195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.910219 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.984031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.984084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.984095 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.984113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:18 crc kubenswrapper[5004]: I1203 14:07:18.984124 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:18Z","lastTransitionTime":"2025-12-03T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.001565 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:18Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.006149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.006205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.006222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.006246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.006266 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.023318 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:19Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.027164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.027201 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.027210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.027224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.027234 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.042563 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:19Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.045957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.045995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.046005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.046036 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.046045 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.066898 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:19Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.070437 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.070476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.070488 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.070504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.070515 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.081054 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:19Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.081175 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.082685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.082747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.082768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.082792 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.082809 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.185632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.185670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.185681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.185697 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.185707 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.289234 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.289288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.289303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.289326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.289342 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.391445 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.391495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.391505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.391519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.391529 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.494181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.494219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.494229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.494246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.494258 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.595998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.596033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.596047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.596062 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.596073 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.612635 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.612768 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.612903 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:19 crc kubenswrapper[5004]: E1203 14:07:19.613029 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.698196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.698237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.698272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.698289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.698298 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.800781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.800845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.800895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.800927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.800952 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.902898 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.902933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.902944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.902967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:19 crc kubenswrapper[5004]: I1203 14:07:19.902981 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:19Z","lastTransitionTime":"2025-12-03T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.008160 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.008565 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.008585 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.008611 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.008628 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.110350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.110384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.110392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.110405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.110415 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.212310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.212347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.212363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.212382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.212394 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.314725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.314763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.314773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.314788 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.314799 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.418251 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.418289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.418298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.418313 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.418324 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.520500 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.520554 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.520565 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.520582 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.520593 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.612415 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.612482 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:20 crc kubenswrapper[5004]: E1203 14:07:20.612589 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:20 crc kubenswrapper[5004]: E1203 14:07:20.612661 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.622846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.622908 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.622919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.622939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.622995 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.725026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.725067 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.725078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.725094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.725105 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.827282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.827322 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.827345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.827361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.827372 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.929395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.929435 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.929450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.929494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:20 crc kubenswrapper[5004]: I1203 14:07:20.929507 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:20Z","lastTransitionTime":"2025-12-03T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.031272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.031326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.031339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.031392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.031406 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.134164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.134207 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.134218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.134240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.134249 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.237156 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.237461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.237549 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.237669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.237732 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.340463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.340501 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.340511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.340527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.340537 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.443047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.443081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.443090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.443104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.443113 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.546901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.547278 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.547485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.547700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.547936 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.612770 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:07:21 crc kubenswrapper[5004]: E1203 14:07:21.613108 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.612808 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:07:21 crc kubenswrapper[5004]: E1203 14:07:21.613935 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.650468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.650533 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.650551 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.650575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.650593 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.752775 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.752849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.752887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.752907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.752930 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.855356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.855608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.855700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.855769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.855829 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.958733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.958791 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.958807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.958830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:21 crc kubenswrapper[5004]: I1203 14:07:21.958845 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:21Z","lastTransitionTime":"2025-12-03T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.061470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.061739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.061828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.061972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.062079 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.166065 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.166409 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.166515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.166617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.166740 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.276032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.276111 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.276123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.276143 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.276154 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.378382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.378659 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.378727 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.378789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.378844 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.481923 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.481969 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.481980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.481995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.482004 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.584387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.584425 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.584438 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.584454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.584464 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.611940 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:07:22 crc kubenswrapper[5004]: E1203 14:07:22.612093 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.612136 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:07:22 crc kubenswrapper[5004]: E1203 14:07:22.612272 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.687346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.687421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.687431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.687444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.687453 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.789772 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.789819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.789833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.789869 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.789879 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.892137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.892185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.892199 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.892215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.892226 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.994843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.994915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.994927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.994943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:22 crc kubenswrapper[5004]: I1203 14:07:22.994957 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:22Z","lastTransitionTime":"2025-12-03T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.098299 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.098371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.098384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.098401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.098414 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.201253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.201306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.201320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.201344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.201356 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.303947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.304026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.304051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.304081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.304102 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.407805 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.407924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.407951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.407984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.408005 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.510325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.510366 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.510375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.510396 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.510406 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.612697 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.612758 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:07:23 crc kubenswrapper[5004]: E1203 14:07:23.612830 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:07:23 crc kubenswrapper[5004]: E1203 14:07:23.613013 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.616084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.616136 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.616149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.616168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.616192 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.720423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.720461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.720471 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.720488 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.720498 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.823481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.823527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.823537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.823552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.823565 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.927631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.927677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.927689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.927706 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:23 crc kubenswrapper[5004]: I1203 14:07:23.927717 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:23Z","lastTransitionTime":"2025-12-03T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.029917 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.029957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.029969 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.029982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.029996 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.133139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.133198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.133210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.133230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.133243 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.235413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.235471 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.235484 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.235502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.235515 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.346662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.346696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.346745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.346764 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.346774 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.449608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.449655 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.449666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.449684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.449696 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.552524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.552589 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.552604 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.552625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.552638 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.612741 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.612784 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:24 crc kubenswrapper[5004]: E1203 14:07:24.612922 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:24 crc kubenswrapper[5004]: E1203 14:07:24.612997 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.654703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.654748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.654763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.654782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.654794 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.757852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.757972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.757996 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.758027 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.758051 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.860469 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.860514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.860527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.860546 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.860559 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.963063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.963115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.963127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.963145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:24 crc kubenswrapper[5004]: I1203 14:07:24.963157 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:24Z","lastTransitionTime":"2025-12-03T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.065270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.065321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.065332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.065347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.065357 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.167744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.167789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.167799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.167814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.167822 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.269964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.270005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.270014 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.270030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.270042 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.372688 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.372738 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.372749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.372767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.372783 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.475621 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.475670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.475686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.475707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.475723 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.578824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.578884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.578893 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.578909 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.578919 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.612731 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.612873 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:25 crc kubenswrapper[5004]: E1203 14:07:25.613028 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:25 crc kubenswrapper[5004]: E1203 14:07:25.613212 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.681332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.681386 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.681397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.681412 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.681423 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.783891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.783958 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.783981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.784011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.784034 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.886976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.887039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.887051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.887069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.887081 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.989561 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.989625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.989643 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.989668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:25 crc kubenswrapper[5004]: I1203 14:07:25.989690 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:25Z","lastTransitionTime":"2025-12-03T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.091979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.092047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.092065 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.092092 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.092109 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.195001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.195048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.195063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.195084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.195099 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.296924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.296975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.296986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.297003 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.297014 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.399977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.400023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.400034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.400050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.400060 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.502616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.502670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.502686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.502706 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.502720 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.605652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.605982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.606095 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.606206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.606299 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.612101 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:26 crc kubenswrapper[5004]: E1203 14:07:26.612205 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.612508 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:26 crc kubenswrapper[5004]: E1203 14:07:26.612781 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.708881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.708915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.708925 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.708939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.708948 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.811041 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.811087 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.811098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.811118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.811131 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.913469 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.913508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.913519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.913535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:26 crc kubenswrapper[5004]: I1203 14:07:26.913545 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:26Z","lastTransitionTime":"2025-12-03T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.019253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.019289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.019316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.019332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.019344 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.121114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.121148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.121161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.121175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.121184 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.224438 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.224494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.224505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.224527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.224544 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.326620 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.326668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.326680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.326695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.326707 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.429322 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.429391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.429413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.429444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.429466 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.531653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.531693 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.531703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.531718 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.531728 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.611963 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.611979 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:27 crc kubenswrapper[5004]: E1203 14:07:27.612073 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:27 crc kubenswrapper[5004]: E1203 14:07:27.612190 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.633330 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.633365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.633376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.633391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.633402 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.635395 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.647007 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.657921 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.690563 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.703321 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.714911 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.727600 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.736070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.736100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.736110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.736126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.736136 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.738423 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.750165 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.760119 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.768340 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.781959 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.792948 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 
2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.806269 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.819310 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.832635 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.838096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.838135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.838149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.838164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.838173 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.843998 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.856313 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:27Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.939948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.940289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.940298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.940313 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:27 crc kubenswrapper[5004]: I1203 14:07:27.940323 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:27Z","lastTransitionTime":"2025-12-03T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.043288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.043338 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.043354 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.043377 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.043391 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.146182 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.146219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.146231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.146248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.146260 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.249100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.249176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.249190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.249209 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.249227 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.351554 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.351610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.351624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.351645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.351658 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.453799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.453851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.453892 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.453916 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.453932 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.556591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.556645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.556657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.556677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.556690 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.612631 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.612741 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:28 crc kubenswrapper[5004]: E1203 14:07:28.612763 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:28 crc kubenswrapper[5004]: E1203 14:07:28.612935 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.658823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.658875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.658887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.658905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.658919 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.761210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.761248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.761258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.761275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.761287 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.864805 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.864842 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.864868 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.864888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.864904 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.967845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.967986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.968040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.968058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:28 crc kubenswrapper[5004]: I1203 14:07:28.968069 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:28Z","lastTransitionTime":"2025-12-03T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.070716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.070767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.070781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.070798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.070812 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.173637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.173692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.173703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.173721 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.173731 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.276646 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.276686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.276699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.276716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.276727 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.378811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.378847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.378872 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.378888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.378900 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.387103 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.387138 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.387161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.387179 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.387193 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.405000 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.423004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.423058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.423077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.423100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.423118 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.436051 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.438944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.438974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.438985 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.438999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.439012 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.452533 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.457082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.457112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.457120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.457135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.457145 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.472201 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.475836 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.475905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.475919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.475937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.475949 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.489187 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:29Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.489302 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.490597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.490628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.490637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.490659 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.490668 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.593236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.593277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.593288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.593305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.593318 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.612641 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.612791 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.613050 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:29 crc kubenswrapper[5004]: E1203 14:07:29.613120 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.695875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.695913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.695922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.695961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.695972 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.801001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.801071 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.801089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.801118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.801134 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.903798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.903827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.903835 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.903848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:29 crc kubenswrapper[5004]: I1203 14:07:29.903870 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:29Z","lastTransitionTime":"2025-12-03T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.006524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.006710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.006721 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.006739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.006748 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.108436 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.108468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.108480 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.108524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.108535 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.210663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.210692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.210700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.210713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.210722 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.313274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.313344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.313360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.313382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.313398 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.415058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.415105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.415118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.415136 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.415150 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.518068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.518116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.518127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.518143 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.518152 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.612808 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.612915 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:30 crc kubenswrapper[5004]: E1203 14:07:30.613005 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:30 crc kubenswrapper[5004]: E1203 14:07:30.613238 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.614035 5004 scope.go:117] "RemoveContainer" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" Dec 03 14:07:30 crc kubenswrapper[5004]: E1203 14:07:30.614260 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.620876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.620904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.620916 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.620933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.620947 5004 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.724023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.724085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.724097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.724112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.724149 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.826416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.826467 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.826482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.826504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.826518 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.929301 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.929400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.929414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.929438 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:30 crc kubenswrapper[5004]: I1203 14:07:30.929452 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:30Z","lastTransitionTime":"2025-12-03T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.031895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.031950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.031961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.031976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.031986 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.134984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.135019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.135032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.135048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.135060 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.237408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.237443 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.237452 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.237465 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.237475 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.340025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.340068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.340077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.340094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.340104 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.447769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.447814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.447827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.447845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.447872 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.550207 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.550248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.550260 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.550278 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.550290 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.611843 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.611916 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:31 crc kubenswrapper[5004]: E1203 14:07:31.611993 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:31 crc kubenswrapper[5004]: E1203 14:07:31.612167 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.654481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.654530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.654543 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.654568 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.654580 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.757263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.757315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.757331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.757347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.757359 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.859628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.859663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.859675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.859692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.859704 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.962236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.962284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.962297 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.962314 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:31 crc kubenswrapper[5004]: I1203 14:07:31.962326 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:31Z","lastTransitionTime":"2025-12-03T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.064942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.064979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.064988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.065005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.065015 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.167364 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.167427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.167436 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.167453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.167463 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.269876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.269924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.269936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.269954 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.269968 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.361218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:32 crc kubenswrapper[5004]: E1203 14:07:32.361392 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:32 crc kubenswrapper[5004]: E1203 14:07:32.361489 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:08:04.361467644 +0000 UTC m=+97.110437930 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.372555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.372584 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.372594 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.372607 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.372616 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.475804 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.475834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.475843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.475870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.475879 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.577919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.577988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.578001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.578016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.578028 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.612389 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.612408 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:32 crc kubenswrapper[5004]: E1203 14:07:32.612528 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:32 crc kubenswrapper[5004]: E1203 14:07:32.612606 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.680138 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.680188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.680229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.680250 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.680266 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.782586 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.782629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.782641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.782658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.782667 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.884829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.884904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.884919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.884939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.884951 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.987118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.987155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.987166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.987181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:32 crc kubenswrapper[5004]: I1203 14:07:32.987192 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:32Z","lastTransitionTime":"2025-12-03T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.089826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.089891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.089900 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.089916 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.089926 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.191575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.191625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.191636 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.191653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.191665 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.294990 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.295030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.295040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.295055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.295068 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.397651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.397695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.397709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.397729 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.397740 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.500378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.500407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.500418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.500432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.500442 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.603193 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.603235 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.603246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.603265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.603277 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.611957 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.612039 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:33 crc kubenswrapper[5004]: E1203 14:07:33.612160 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:33 crc kubenswrapper[5004]: E1203 14:07:33.612237 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.706058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.706114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.706128 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.706145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.706184 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.808181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.808214 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.808222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.808236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.808246 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.910068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.910094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.910103 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.910118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:33 crc kubenswrapper[5004]: I1203 14:07:33.910127 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:33Z","lastTransitionTime":"2025-12-03T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.012346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.012397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.012408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.012424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.012753 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.115833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.115899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.115911 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.115929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.115940 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.218467 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.218502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.218512 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.218526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.218535 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.320954 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.321013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.321025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.321040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.321051 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.422972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.423018 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.423029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.423043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.423053 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.526079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.526125 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.526137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.526155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.526168 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.612362 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:34 crc kubenswrapper[5004]: E1203 14:07:34.612492 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.612500 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:34 crc kubenswrapper[5004]: E1203 14:07:34.612684 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.628026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.628071 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.628080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.628115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.628125 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.733031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.733083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.733098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.733114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.733124 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.835409 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.835458 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.835474 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.835491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.835503 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.937529 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.937572 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.937606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.937623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:34 crc kubenswrapper[5004]: I1203 14:07:34.937635 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:34Z","lastTransitionTime":"2025-12-03T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.040139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.040188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.040202 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.040239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.040554 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.042914 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/0.log" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.042958 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff08cd56-3e47-4cd7-98ad-8571f178dc62" containerID="76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e" exitCode=1 Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.042997 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerDied","Data":"76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.043445 5004 scope.go:117] "RemoveContainer" containerID="76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.054648 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.072402 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.092179 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.103882 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.119305 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.132593 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.142540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.142588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.142601 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.142618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.142630 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.145721 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.157729 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.169741 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.178603 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.190457 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.201811 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.214991 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.225548 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.238239 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.245046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.245079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.245091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.245108 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.245120 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.247796 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc 
kubenswrapper[5004]: I1203 14:07:35.260562 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.271684 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:35Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.346794 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.346834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.346844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 
14:07:35.346878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.346891 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.448799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.448851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.448881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.448897 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.448908 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.551306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.551371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.551384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.551400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.551410 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.612840 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.612850 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:35 crc kubenswrapper[5004]: E1203 14:07:35.613046 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:35 crc kubenswrapper[5004]: E1203 14:07:35.613096 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.652893 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.652919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.652927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.652939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.652948 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.755089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.755117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.755126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.755140 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.755150 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.857327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.857357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.857365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.857378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.857386 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.959980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.960017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.960029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.960044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:35 crc kubenswrapper[5004]: I1203 14:07:35.960055 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:35Z","lastTransitionTime":"2025-12-03T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.047211 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/0.log" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.047270 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerStarted","Data":"70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.062894 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.062929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.062940 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.062958 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.062971 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.073804 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.090507 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.103503 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.121607 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.134201 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.146929 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.158063 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.165533 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.165567 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.165576 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.165592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.165602 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.167704 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.177750 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.189887 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.199040 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.209253 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.218492 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc 
kubenswrapper[5004]: I1203 14:07:36.233632 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.245732 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.258024 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.267929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.267960 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.267970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.267987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.267998 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.268449 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.282714 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:36Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.370096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.370166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.370178 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.370197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.370207 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.472988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.473034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.473049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.473066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.473078 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.575447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.575510 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.575528 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.575547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.575560 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.612207 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:36 crc kubenswrapper[5004]: E1203 14:07:36.612358 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.612213 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:36 crc kubenswrapper[5004]: E1203 14:07:36.612532 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.678556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.678598 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.678609 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.678625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.678635 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.780255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.780297 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.780309 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.780325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.780335 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.882928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.882962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.882970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.882984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.882995 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.985016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.985061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.985075 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.985094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:36 crc kubenswrapper[5004]: I1203 14:07:36.985105 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:36Z","lastTransitionTime":"2025-12-03T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.087376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.087440 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.087454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.087470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.087482 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.189597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.189644 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.189656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.189678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.189692 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.291935 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.291968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.291978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.291993 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.292005 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.394701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.394794 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.394810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.394840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.394888 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.500647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.500724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.500744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.500773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.500793 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.603706 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.603757 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.603771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.603788 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.603797 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.612121 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.612152 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:37 crc kubenswrapper[5004]: E1203 14:07:37.612237 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:37 crc kubenswrapper[5004]: E1203 14:07:37.612339 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.624609 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.635999 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.654587 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.675457 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.691788 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.703803 5004 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.705433 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.705562 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.705579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.705595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.705624 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.713030 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.722448 5004 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001e
dfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-por
t\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.733633 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.743720 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.752394 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.764025 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.775543 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.787330 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.797317 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.807787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc 
kubenswrapper[5004]: I1203 14:07:37.807831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.807844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.807876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.807895 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.810458 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.821131 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc 
kubenswrapper[5004]: I1203 14:07:37.834226 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:37Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.909973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.910002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.910012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.910025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:37 crc kubenswrapper[5004]: I1203 14:07:37.910034 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:37Z","lastTransitionTime":"2025-12-03T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.013070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.013110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.013119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.013133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.013145 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.115482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.115530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.115543 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.115561 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.115590 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.217283 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.217321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.217332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.217348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.217359 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.320242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.320286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.320296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.320316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.320329 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.422585 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.422645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.422662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.422689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.422707 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.525639 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.525685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.525694 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.525710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.525719 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.612767 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.612826 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:38 crc kubenswrapper[5004]: E1203 14:07:38.612935 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:38 crc kubenswrapper[5004]: E1203 14:07:38.613025 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.627429 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.627464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.627474 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.627489 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.627500 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.730153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.730188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.730196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.730210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.730218 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.833414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.833464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.833479 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.833497 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.833509 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.936423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.936471 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.936484 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.936502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:38 crc kubenswrapper[5004]: I1203 14:07:38.936514 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:38Z","lastTransitionTime":"2025-12-03T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.039236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.039279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.039289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.039308 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.039320 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.141547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.141588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.141600 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.141617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.141627 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.244516 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.244564 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.244576 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.244597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.244609 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.347167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.347198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.347208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.347223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.347234 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.449141 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.449468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.449571 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.449672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.449757 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.552077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.552302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.552400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.552467 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.552528 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.612656 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.612673 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.612882 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.612940 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.654609 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.654851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.654959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.655112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.655214 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.758115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.758552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.758663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.758781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.758919 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.861354 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.861392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.861401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.861418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.861427 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.887385 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.887451 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.887464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.887480 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.887492 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.900297 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.903610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.903640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.903648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.903663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.903674 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.916827 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.920849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.920929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.920943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.920960 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.920970 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.936840 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.940783 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.940825 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.940837 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.940872 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.940887 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.952436 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.956529 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.956563 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.956573 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.956591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.956604 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.968997 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:39Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:39 crc kubenswrapper[5004]: E1203 14:07:39.969108 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.970376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.970393 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.970403 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.970416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:39 crc kubenswrapper[5004]: I1203 14:07:39.970426 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:39Z","lastTransitionTime":"2025-12-03T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.073031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.073080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.073105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.073123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.073135 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.175461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.175509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.175521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.175538 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.175549 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.277928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.277973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.277985 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.278005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.278020 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.379712 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.379758 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.379769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.379786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.379797 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.482187 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.482246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.482270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.482302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.482327 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.585414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.585447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.585456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.585470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.585479 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.612575 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.612593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:40 crc kubenswrapper[5004]: E1203 14:07:40.612758 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:40 crc kubenswrapper[5004]: E1203 14:07:40.612880 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.689412 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.689563 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.689581 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.689601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.689612 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.792034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.792097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.792109 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.792153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.792167 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.895176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.895230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.895253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.895280 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.895301 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.998432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.998468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.998478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.998494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:40 crc kubenswrapper[5004]: I1203 14:07:40.998505 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:40Z","lastTransitionTime":"2025-12-03T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.101108 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.101142 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.101153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.101166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.101176 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.203486 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.203525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.203533 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.203552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.203563 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.306052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.306515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.306726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.306965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.307164 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.409787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.410221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.410364 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.410470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.410562 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.513363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.513396 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.513413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.513430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.513441 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.613050 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:41 crc kubenswrapper[5004]: E1203 14:07:41.613229 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.613050 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:41 crc kubenswrapper[5004]: E1203 14:07:41.613713 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.614010 5004 scope.go:117] "RemoveContainer" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.616414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.616637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.616727 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.616850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.617180 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.720073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.720520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.720537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.720557 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.720572 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.823168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.823202 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.823210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.823224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.823233 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.925686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.925744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.925754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.925770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:41 crc kubenswrapper[5004]: I1203 14:07:41.925781 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:41Z","lastTransitionTime":"2025-12-03T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.027988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.028029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.028040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.028055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.028068 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.130337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.130365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.130374 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.130391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.130402 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.233486 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.233520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.233531 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.233550 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.233561 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.356265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.356304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.356315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.356332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.356344 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.458465 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.458509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.458519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.458537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.458548 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.560503 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.560542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.560555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.560570 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.560583 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.612283 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:42 crc kubenswrapper[5004]: E1203 14:07:42.612398 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.612298 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:42 crc kubenswrapper[5004]: E1203 14:07:42.612471 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.663312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.663347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.663356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.663370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.663399 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.766137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.766179 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.766192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.766210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.766223 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.868315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.868351 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.868361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.868376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.868385 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.971100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.971134 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.971148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.971164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:42 crc kubenswrapper[5004]: I1203 14:07:42.971174 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:42Z","lastTransitionTime":"2025-12-03T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.069551 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/3.log" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.070677 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/2.log" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.072705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.072783 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.072968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.073012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.073030 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.076778 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" exitCode=1 Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.076824 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.076880 5004 scope.go:117] "RemoveContainer" containerID="897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.078175 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:07:43 crc kubenswrapper[5004]: E1203 14:07:43.078445 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.092727 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf40
47d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.107112 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.119622 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc 
kubenswrapper[5004]: I1203 14:07:43.133683 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.146783 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.158768 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.176452 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.176491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.176502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.176519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.176532 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.177372 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:43Z\\\",\\\"message\\\":\\\"7:42.719479 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:42Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:42.719491 6987 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719510 6987 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719521 6987 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1203 14:07:42.718795 6987 services_controller.go:451] Built service openshift-kube-sc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd
\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.196567 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.211258 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.222603 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.235298 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.248423 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0
ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.264165 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.278633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.278688 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.278699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.278714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.278728 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.279922 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.293664 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.311760 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.326263 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.340077 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:43Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.381190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.381251 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.381260 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.381275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.381285 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.484297 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.484363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.484381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.484405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.484427 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.587696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.587779 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.587797 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.587821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.587836 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.612520 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.612652 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:43 crc kubenswrapper[5004]: E1203 14:07:43.612728 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:43 crc kubenswrapper[5004]: E1203 14:07:43.612854 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.691081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.691145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.691157 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.691178 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.691194 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.794760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.794809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.794819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.794838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.794851 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.897227 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.897268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.897277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.897291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:43 crc kubenswrapper[5004]: I1203 14:07:43.897303 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:43Z","lastTransitionTime":"2025-12-03T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.000113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.000165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.000175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.000192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.000203 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.082160 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/3.log" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.103012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.103065 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.103078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.103096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.103108 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.205957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.206005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.206017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.206035 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.206048 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.309813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.309907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.309920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.309936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.310274 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.413089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.413131 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.413142 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.413164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.413177 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.515428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.515504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.515527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.515556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.515579 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.612601 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.612671 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:44 crc kubenswrapper[5004]: E1203 14:07:44.612747 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:44 crc kubenswrapper[5004]: E1203 14:07:44.612937 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.618372 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.618404 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.618413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.618426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.618436 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.721401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.721460 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.721474 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.721499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.721519 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.825394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.825434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.825444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.825462 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.825476 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.928176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.928211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.928219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.928233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:44 crc kubenswrapper[5004]: I1203 14:07:44.928242 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:44Z","lastTransitionTime":"2025-12-03T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.030097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.030171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.030188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.030212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.030229 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.133063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.133176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.133198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.133225 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.133244 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.235566 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.235607 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.235622 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.235641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.235652 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.337434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.337496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.337508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.337527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.337538 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.440348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.440390 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.440402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.440417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.440428 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.542422 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.542466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.542477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.542493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.542504 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.612883 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.612958 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:45 crc kubenswrapper[5004]: E1203 14:07:45.613022 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:45 crc kubenswrapper[5004]: E1203 14:07:45.613069 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.644558 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.644606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.644626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.644642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.644655 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.747461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.747504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.747512 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.747527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.747537 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.850815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.850851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.850883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.850901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.850913 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.953932 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.954000 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.954023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.954053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:45 crc kubenswrapper[5004]: I1203 14:07:45.954076 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:45Z","lastTransitionTime":"2025-12-03T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.056631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.056679 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.056696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.056713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.056723 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.159846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.159916 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.159929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.159948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.159961 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.262272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.262311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.262323 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.262339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.262351 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.365252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.365309 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.365324 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.365344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.365360 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.468674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.468704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.468713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.468725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.468734 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.570733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.570788 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.570801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.570817 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.570827 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.611901 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.611919 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:46 crc kubenswrapper[5004]: E1203 14:07:46.612093 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:46 crc kubenswrapper[5004]: E1203 14:07:46.612172 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.674191 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.674256 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.674278 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.674309 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.674330 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.776802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.776888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.776921 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.776937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.776948 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.880070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.880139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.880153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.880170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.880183 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.982569 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.982613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.982623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.982641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:46 crc kubenswrapper[5004]: I1203 14:07:46.982654 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:46Z","lastTransitionTime":"2025-12-03T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.085778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.085828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.085837 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.085852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.085890 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.189141 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.189175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.189186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.189203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.189215 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.292488 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.292527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.292539 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.292555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.292567 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.396311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.396358 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.396375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.396398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.396416 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.499529 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.499604 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.499623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.499649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.499671 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.602530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.602576 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.602591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.602611 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.602626 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.612258 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:47 crc kubenswrapper[5004]: E1203 14:07:47.612359 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.612629 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:47 crc kubenswrapper[5004]: E1203 14:07:47.612699 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.635407 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.647822 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.657970 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.675820 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897e02f6bd7403b74505d5dae4f96cfe01b73d59df9579da24cb95e650c23b6e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"message\\\":\\\"cs-daemon-dgzr8\\\\nI1203 14:07:16.427583 6630 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1203 14:07:16.427584 6630 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-s6kp7 after 0 failed attempt(s)\\\\nI1203 14:07:16.427589 6630 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 14:07:16.427553 6630 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:16Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:16.427596 6630 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:43Z\\\",\\\"message\\\":\\\"7:42.719479 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:42Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:42.719491 6987 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719510 6987 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719521 6987 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1203 14:07:42.718795 6987 services_controller.go:451] Built service openshift-kube-sc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd
\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.692635 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.705127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.705181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.705192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.705208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.705219 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.708381 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5
689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.720945 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.731221 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.742973 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.755905 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.765713 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.777193 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.786664 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc 
kubenswrapper[5004]: I1203 14:07:47.800225 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.807427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.807468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.807477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.807491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.807499 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.817524 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.867470 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.879596 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.892974 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:47Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.909348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.909404 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.909413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.909426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:47 crc kubenswrapper[5004]: I1203 14:07:47.909435 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:47Z","lastTransitionTime":"2025-12-03T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.011579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.011639 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.011651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.011667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.011678 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.113526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.113610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.113627 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.113650 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.113698 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.216776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.216848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.216903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.216931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.216954 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.320205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.320232 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.320240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.320253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.320263 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.422553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.422605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.422620 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.422640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.422656 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.525380 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.525431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.525446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.525467 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.525482 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.612439 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:48 crc kubenswrapper[5004]: E1203 14:07:48.612572 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.612777 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:48 crc kubenswrapper[5004]: E1203 14:07:48.612843 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.628345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.628373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.628382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.628395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.628405 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.730718 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.730977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.731090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.731162 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.731229 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.833594 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.833632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.833644 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.833660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.833671 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.936304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.936535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.936670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.936773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:48 crc kubenswrapper[5004]: I1203 14:07:48.936869 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:48Z","lastTransitionTime":"2025-12-03T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.039824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.040133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.040221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.040348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.040447 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.143636 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.143699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.143710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.143732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.143746 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.246607 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.247094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.247277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.247405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.247535 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.350577 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.350645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.350657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.350678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.350690 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.438578 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.438992 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:53.438941662 +0000 UTC m=+146.187911898 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.454009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.454078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.454101 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.454135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.454159 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.539113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.539156 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.539176 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.539200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539298 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539298 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539345 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:08:53.539331023 +0000 UTC m=+146.288301259 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539366 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539382 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 14:08:53.539361804 +0000 UTC m=+146.288332100 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539392 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539409 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539445 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 14:08:53.539437136 +0000 UTC m=+146.288407372 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539368 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539476 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539489 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.539525 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 14:08:53.539517199 +0000 UTC m=+146.288487435 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.557627 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.557661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.557670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.557685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.557693 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.612260 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.612330 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.613034 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:49 crc kubenswrapper[5004]: E1203 14:07:49.613211 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.660811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.660879 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.660896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.660920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.660932 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.763816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.763846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.763871 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.763887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.763897 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.866244 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.866293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.866304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.866321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.866334 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.968813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.969068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.969169 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.969253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:49 crc kubenswrapper[5004]: I1203 14:07:49.969391 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:49Z","lastTransitionTime":"2025-12-03T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.013392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.013692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.013756 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.013826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.013916 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.032642 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.036510 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.036565 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.036575 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.036590 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.036600 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.053571 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.056899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.056943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.056952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.056968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.056978 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.072191 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.075600 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.075725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.075758 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.075781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.075797 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.087483 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.090932 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.090962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.090971 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.090986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.090996 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.101809 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1aad4288-4570-4881-9224-58a7361c56c9\\\",\\\"systemUUID\\\":\\\"114765c9-74c4-4d31-b4e3-ce1e142d6291\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:50Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.102059 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.103494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.103518 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.103526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.103539 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.103548 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.206117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.206168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.206186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.206210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.206226 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.309032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.309073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.309085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.309104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.309117 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.411413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.411485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.411507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.411535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.411555 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.514113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.514154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.514166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.514196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.514208 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.612156 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.612272 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.612689 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:50 crc kubenswrapper[5004]: E1203 14:07:50.613068 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.616901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.616968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.617011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.617041 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.617061 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.719814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.719873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.719886 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.719902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.719912 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.822592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.822813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.822948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.823038 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.823106 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.925281 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.925331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.925347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.925368 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:50 crc kubenswrapper[5004]: I1203 14:07:50.925384 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:50Z","lastTransitionTime":"2025-12-03T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.032198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.032527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.032782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.032994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.033173 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.136038 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.136090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.136104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.136122 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.136135 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.238843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.238915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.238927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.238947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.238962 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.296503 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.298308 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:07:51 crc kubenswrapper[5004]: E1203 14:07:51.299050 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.308974 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.326656 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.340166 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.341998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.342135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.342219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.342320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.342398 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.364300 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.381116 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.399324 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.412706 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc 
kubenswrapper[5004]: I1203 14:07:51.428408 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc
03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.443476 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.445329 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.445361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.445371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 
14:07:51.445408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.445422 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.460221 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.483125 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:43Z\\\",\\\"message\\\":\\\"7:42.719479 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:42Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:42.719491 6987 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719510 6987 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719521 6987 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1203 14:07:42.718795 6987 services_controller.go:451] Built service openshift-kube-sc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.505838 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.519006 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.529213 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.537992 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.548552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.548591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.548601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.548617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.548628 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.552379 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d
12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.565486 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.579994 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:51Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.612364 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.612409 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:51 crc kubenswrapper[5004]: E1203 14:07:51.612736 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:51 crc kubenswrapper[5004]: E1203 14:07:51.612932 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.650623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.650770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.650850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.650986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.651074 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.753579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.753641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.753658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.753686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.753703 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.856561 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.856617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.856637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.856665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.856687 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.958660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.958707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.958721 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.958740 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:51 crc kubenswrapper[5004]: I1203 14:07:51.958753 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:51Z","lastTransitionTime":"2025-12-03T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.061836 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.061979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.061997 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.062023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.062039 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.163794 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.163843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.163873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.163894 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.163906 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.266605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.266652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.266671 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.266689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.266699 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.369326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.370673 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.370935 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.371172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.371415 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.474660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.474714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.474728 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.474747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.474760 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.577436 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.577468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.577476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.577490 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.577499 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.612547 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:52 crc kubenswrapper[5004]: E1203 14:07:52.612720 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.612559 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:52 crc kubenswrapper[5004]: E1203 14:07:52.612800 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.679961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.680009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.680031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.680051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.680064 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.782273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.782324 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.782338 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.782355 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.782367 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.884349 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.884412 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.884428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.884448 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.884547 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.987152 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.987208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.987219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.987236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:52 crc kubenswrapper[5004]: I1203 14:07:52.987248 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:52Z","lastTransitionTime":"2025-12-03T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.090001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.090096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.090114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.090137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.090153 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.192931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.192994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.193008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.193028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.193043 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.295379 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.295460 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.295496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.295520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.295538 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.397362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.397397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.397407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.397424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.397434 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.500371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.500407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.500428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.500455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.500471 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.603236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.603295 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.603307 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.603325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.603338 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.612595 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:53 crc kubenswrapper[5004]: E1203 14:07:53.612743 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.613443 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:53 crc kubenswrapper[5004]: E1203 14:07:53.613532 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.624693 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.705521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.705579 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.705618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.705637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.705649 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.808384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.808454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.808472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.808493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.808509 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.912184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.912277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.912294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.912316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:53 crc kubenswrapper[5004]: I1203 14:07:53.912331 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:53Z","lastTransitionTime":"2025-12-03T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.015016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.015072 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.015082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.015097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.015110 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.116707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.116748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.116761 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.116776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.116787 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.219767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.219825 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.219850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.219928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.219953 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.322935 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.322971 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.322983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.322999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.323010 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.424803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.424835 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.424843 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.424878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.424888 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.527252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.527359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.527383 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.527410 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.527430 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.612419 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.612415 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:54 crc kubenswrapper[5004]: E1203 14:07:54.612760 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:54 crc kubenswrapper[5004]: E1203 14:07:54.612950 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.629817 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.629883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.629895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.629912 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.629922 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.733292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.733334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.733349 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.733369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.733385 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.836677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.836726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.836737 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.836754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.836788 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.940597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.940667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.940693 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.940725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:54 crc kubenswrapper[5004]: I1203 14:07:54.940749 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:54Z","lastTransitionTime":"2025-12-03T14:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.044150 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.044220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.044232 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.044252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.044266 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.147217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.147288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.147311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.147342 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.147363 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.250249 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.250336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.250360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.250390 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.250411 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.353442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.353479 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.353514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.353539 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.353563 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.456362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.456431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.456448 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.456466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.456478 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.559490 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.559544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.559556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.559574 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.559586 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.612054 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.612150 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:55 crc kubenswrapper[5004]: E1203 14:07:55.612227 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:55 crc kubenswrapper[5004]: E1203 14:07:55.612388 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.661776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.661808 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.661818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.661832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.661843 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.763561 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.763596 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.763603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.763619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.763627 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.866348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.866455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.866475 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.866499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.866522 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.969714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.969767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.969780 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.969798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:55 crc kubenswrapper[5004]: I1203 14:07:55.969811 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:55Z","lastTransitionTime":"2025-12-03T14:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.072269 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.072318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.072334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.072356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.072372 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.174977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.175019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.175030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.175044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.175053 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.278048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.278270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.278333 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.278630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.278724 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.381626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.381676 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.381687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.381706 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.381717 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.484298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.484369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.484392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.484421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.484443 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.587316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.587571 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.587657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.587771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.587885 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.612017 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:56 crc kubenswrapper[5004]: E1203 14:07:56.612317 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.612117 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:56 crc kubenswrapper[5004]: E1203 14:07:56.612634 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.690408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.690996 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.691029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.691047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.691056 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.795265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.795308 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.795316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.795332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.795342 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.897407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.897458 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.897467 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.897481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:56 crc kubenswrapper[5004]: I1203 14:07:56.897507 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:56Z","lastTransitionTime":"2025-12-03T14:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.000329 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.000380 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.000397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.000419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.000435 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.102447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.102487 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.102498 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.102513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.102525 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.204924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.204985 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.204994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.205011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.205020 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.307913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.308028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.308048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.308068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.308082 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.411215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.411264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.411282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.411307 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.411325 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.513986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.514331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.514485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.514626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.514760 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.613687 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:57 crc kubenswrapper[5004]: E1203 14:07:57.613850 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.614160 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:57 crc kubenswrapper[5004]: E1203 14:07:57.614279 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.620542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.620788 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.620990 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.621170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.621308 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.631407 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"106fb4f4-87db-4b57-9ce1-2ef97234bc43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a7026ee371ac3d413422b177420ba50d30ab0a13a9746ca5c170888521c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c5dc42a33f8ba6b3b01724eb127d
12bde2c9de56ab8940f0da924daaea2ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c0fb7eee523372b3cd44c2f754da950526c5e36ed06adbe2c5a68cbe5c8e84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0ea23c82561dc6bb4a6cfec6a68a8b21dd0295dee42f2893ab77787d6bb000b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.657097 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51302b123e55467af8b07773f197b20f1f0c279a2147ff68be612ebceabee86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c27a54938bfbbc6afd08c2cae42b5c2f57d5689bd81b0e0a9af0ed7ceb74e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.670200 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.687576 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89170a26-1e46-43b1-a994-94a9879d3cf6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5e9cffddb669b6cb2a9e60ae557a25719dc1b903dbb9bb2066ea2415f65739b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59f6f8aeabbb619e93eaa88a4799e5d267ef2
d59869952cbb1832602c24ff214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ff6bm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.700616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ac1a863-821e-4121-89e7-d433127a12ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad046f8be328efdde718a51fc64e5782d31488fb832540d33d7352c51c6e8341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796bcb0af6f5f03ec4aa89bb3dfeb97869dbee6b88d5412a73d18dc288cf628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://796bcb0af6f5f03ec4aa89bb3dfeb97869dbee6b88d5412a73d18dc288cf628b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.718013 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf707ab-47f4-4c1d-81e1-6d8801be836b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b85b4d870d33480aaec89e5a831a0e366a8049ebcea39aca0ba1cfd4fd15f0da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460b0a26998044301e6d47e54367f979f8ebcbeaa7828984933c69ba6d970c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dab9faf48e93c8c18b370f033936dbc7ff6f018e90d172e7fb9c38efaf297bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.724491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.724558 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.724573 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.724591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.724603 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.731677 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e877c2b925b08af98b005b430753d915de0c901bcddf49b5439e42340d36e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.744538 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-z2zlx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f436781-8ddc-4947-a631-d020ec46f63a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d818882802f46e05ec38cc10d2263d3e2abc9caa4fd8a4fcac642701dfcb4313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7pqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-z2zlx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.760168 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s6kp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff08cd56-3e47-4cd7-98ad-8571f178dc62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:34Z\\\",\\\"message\\\":\\\"2025-12-03T14:06:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4\\\\n2025-12-03T14:06:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abbe96c1-a984-4489-b208-ba74538feaa4 to /host/opt/cni/bin/\\\\n2025-12-03T14:06:49Z [verbose] multus-daemon started\\\\n2025-12-03T14:06:49Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T14:07:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp5n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s6kp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.774099 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"54394065-8262-4c2e-abdb-c81b096049ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmgt8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:07:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dgzr8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.789216 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5464e9f5-0e39-426f-b9da-d5959948f8fe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 14:06:44.843142 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 14:06:44.843350 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 14:06:44.845005 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1110966446/tls.crt::/tmp/serving-cert-1110966446/tls.key\\\\\\\"\\\\nI1203 14:06:45.358463 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 14:06:45.363491 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 14:06:45.363546 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 14:06:45.363584 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 14:06:45.363592 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 14:06:45.370044 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 14:06:45.370103 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 14:06:45.370115 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 14:06:45.370119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 14:06:45.370123 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 14:06:45.370126 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 14:06:45.370187 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 14:06:45.373155 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.806598 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.820954 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94f1735619b5fd091fe1dcd0a40b78ee0d3ae4de63c38d7808f78615fd27cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.827463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.827965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.828086 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.828183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.828262 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.835725 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f81fc4cb3dc2f38a3908b54c9d6c510d0f6949e7cd447312c9c95312a25bba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tndf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m4g6v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.854368 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mjjss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"180b8370-535b-42de-9d0a-cf5e572c9480\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d02b8fedd1b51d342e716541fa1cc1fc11c673784ba9dfc9ebf3d7cb7dfcc67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f6c6e3cafbb3cc20f779f8f9069e078e6f659465663af2b11646e8d6a0b691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd210096e40bebe8ef88645f2809692f6834af1423c05128164de8b8a79280e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20a7bca218fe4c2f86ce8c5111ed0fc0406e0b35a7ffa6b6f64220f70698a2eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b35
96e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b3596e379d8b4f9fe188880c094f5079b91109655034b91654cc6ee996d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fe86f38eb9649f54e723718467549aa031eb615b2fb60505432e77d4ea2d3d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6112b8f2d3f9c0419f3cbd61335254b9d6aa919c198d7c049c7e412abd6fc6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mjjss\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.875103 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac90b9a4-60fd-4d72-99dd-04824ad6f643\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51d3a701a28feb70d9f735300038f0bb18db1e193238bfcdd2214bebe1a40168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14596307bebeb419e16e1d24e42f00278b229151e180c8e344bf927e8dcb431b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dae88e4748216e54f7d24fa5b01ef12c17660f644b3eaeacadc99f1df7ab28b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e28ba8661931c970
f96daf9248f3314e58ed9167e8d17408241ef36071a0666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3431b25d604402cad203112512ee130677f3f6db3cfe026a8309c97bbb848d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfdb6a05e62cd71a82f38752af83f654597387aa6007300691bbbf10cf69712c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca0392f49d6e942d35751abaf0e826e1d87684dc757c8899cb7ce376568611\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4802d1bdb35caff9a527bda0ae46b64cc07c0c3e6d6890ee3d93e9cb962307a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.887589 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.898343 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kvvjx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5acc6204-5c3a-4d00-9d86-13415fb3f68f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5dbc43a753abf5b0bf06b2cf3c21dbf71111471a59d4ac564425c73cf2519e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsq24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kvvjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.917887 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T14:06:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T14:07:43Z\\\",\\\"message\\\":\\\"7:42.719479 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to 
shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:42Z is after 2025-08-24T17:21:41Z]\\\\nI1203 14:07:42.719491 6987 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719510 6987 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1203 14:07:42.719521 6987 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1203 14:07:42.718795 6987 services_controller.go:451] Built service openshift-kube-sc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T14:07:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T14:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca7965cef0241ca178
88fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T14:06:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T14:06:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmzrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T14:06:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvbnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T14:07:57Z is after 2025-08-24T17:21:41Z" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.930518 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.930773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.930881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.930973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:57 crc kubenswrapper[5004]: I1203 14:07:57.931086 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:57Z","lastTransitionTime":"2025-12-03T14:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.032774 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.032816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.032830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.032846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.032870 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.135177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.135221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.135230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.135245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.135289 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.237472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.237521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.237530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.237545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.237554 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.340707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.340774 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.340793 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.340820 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.340838 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.444159 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.444234 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.444259 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.444291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.444316 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.547608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.547686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.547714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.547745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.547769 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.612720 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.612778 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:07:58 crc kubenswrapper[5004]: E1203 14:07:58.612885 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:07:58 crc kubenswrapper[5004]: E1203 14:07:58.613009 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.650456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.650504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.650513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.650528 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.650537 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.753224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.753972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.754008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.754028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.754042 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.856547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.856595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.856604 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.856618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.856627 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.959132 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.959186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.959197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.959215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:58 crc kubenswrapper[5004]: I1203 14:07:58.959227 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:58Z","lastTransitionTime":"2025-12-03T14:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.061217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.061270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.061282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.061300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.061312 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.163941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.163977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.163987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.164002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.164013 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.266964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.267011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.267021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.267035 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.267044 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.369481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.369524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.369536 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.369553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.369564 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.472283 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.472387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.472419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.472446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.472460 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.576131 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.576172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.576182 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.576201 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.576211 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.612610 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.612636 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:07:59 crc kubenswrapper[5004]: E1203 14:07:59.612814 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:07:59 crc kubenswrapper[5004]: E1203 14:07:59.613376 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.678544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.678580 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.678592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.678609 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.678620 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.780914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.780953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.780962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.780978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.780988 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.883668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.883711 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.883720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.883735 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.883747 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.987095 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.987167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.987189 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.987219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:07:59 crc kubenswrapper[5004]: I1203 14:07:59.987242 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:07:59Z","lastTransitionTime":"2025-12-03T14:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.089498 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.089532 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.089541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.089563 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.089580 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:08:00Z","lastTransitionTime":"2025-12-03T14:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.191391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.191449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.191464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.191483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.191499 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:08:00Z","lastTransitionTime":"2025-12-03T14:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.294665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.295229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.295340 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.295427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.295516 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:08:00Z","lastTransitionTime":"2025-12-03T14:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.359900 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.359944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.359956 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.359973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.359984 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T14:08:00Z","lastTransitionTime":"2025-12-03T14:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.418777 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"]
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.419282 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.421150 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.422684 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.423411 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.423576 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.442110 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.442481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.442688 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.442853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.443124 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.457020 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s6kp7" podStartSLOduration=74.456941484 podStartE2EDuration="1m14.456941484s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.442255879 +0000 UTC m=+93.191226125" watchObservedRunningTime="2025-12-03 14:08:00.456941484 +0000 UTC m=+93.205911720"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.488367 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.488340695 podStartE2EDuration="7.488340695s" podCreationTimestamp="2025-12-03 14:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.470004549 +0000 UTC m=+93.218974815" watchObservedRunningTime="2025-12-03 14:08:00.488340695 +0000 UTC m=+93.237310921"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.500919 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.500893465 podStartE2EDuration="1m15.500893465s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.489473099 +0000 UTC m=+93.238443325" watchObservedRunningTime="2025-12-03 14:08:00.500893465 +0000 UTC m=+93.249863701"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.528933 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mjjss" podStartSLOduration=74.528902154 podStartE2EDuration="1m14.528902154s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.528666257 +0000 UTC m=+93.277636493" watchObservedRunningTime="2025-12-03 14:08:00.528902154 +0000 UTC m=+93.277872390"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.529257 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-z2zlx" podStartSLOduration=75.529250914 podStartE2EDuration="1m15.529250914s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.513459426 +0000 UTC m=+93.262429662" watchObservedRunningTime="2025-12-03 14:08:00.529250914 +0000 UTC m=+93.278221150"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.544526 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.544893 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.545040 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.545193 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.545338 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.544973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.544695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.546405 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.550888 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.554045 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.554028505 podStartE2EDuration="1m15.554028505s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.553555321 +0000 UTC m=+93.302525557" watchObservedRunningTime="2025-12-03 14:08:00.554028505 +0000 UTC m=+93.302998741"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.566991 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ed3984-a4de-4bc4-a89f-70d1ee629c83-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xzgbx\" (UID: \"a0ed3984-a4de-4bc4-a89f-70d1ee629c83\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.606751 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podStartSLOduration=75.606734652 podStartE2EDuration="1m15.606734652s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.606589057 +0000 UTC m=+93.355559303" watchObservedRunningTime="2025-12-03 14:08:00.606734652 +0000 UTC m=+93.355704888"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.612778 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:00 crc kubenswrapper[5004]: E1203 14:08:00.612950 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.613159 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:00 crc kubenswrapper[5004]: E1203 14:08:00.613309 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.650325 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.650307012 podStartE2EDuration="1m14.650307012s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.635275136 +0000 UTC m=+93.384245362" watchObservedRunningTime="2025-12-03 14:08:00.650307012 +0000 UTC m=+93.399277248"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.661265 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kvvjx" podStartSLOduration=75.661241873 podStartE2EDuration="1m15.661241873s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.661070158 +0000 UTC m=+93.410040384" watchObservedRunningTime="2025-12-03 14:08:00.661241873 +0000 UTC m=+93.410212109"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.694878 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.694831811 podStartE2EDuration="44.694831811s" podCreationTimestamp="2025-12-03 14:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.693950734 +0000 UTC m=+93.442920990" watchObservedRunningTime="2025-12-03 14:08:00.694831811 +0000 UTC m=+93.443802047"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.727876 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ff6bm" podStartSLOduration=74.727839171 podStartE2EDuration="1m14.727839171s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:00.727404017 +0000 UTC m=+93.476374253" watchObservedRunningTime="2025-12-03 14:08:00.727839171 +0000 UTC m=+93.476809427"
Dec 03 14:08:00 crc kubenswrapper[5004]: I1203 14:08:00.734557 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx"
Dec 03 14:08:01 crc kubenswrapper[5004]: I1203 14:08:01.139247 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx" event={"ID":"a0ed3984-a4de-4bc4-a89f-70d1ee629c83","Type":"ContainerStarted","Data":"74ae08e41b2a9beabe7ec7f703e204b521d63fe669f5d72d38fc9f48e72b5e3f"}
Dec 03 14:08:01 crc kubenswrapper[5004]: I1203 14:08:01.139313 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx" event={"ID":"a0ed3984-a4de-4bc4-a89f-70d1ee629c83","Type":"ContainerStarted","Data":"32d0617494d605254235e7f388f058cbc96f193aa8c992ae4c2fd0d515e863b4"}
Dec 03 14:08:01 crc kubenswrapper[5004]: I1203 14:08:01.154544 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xzgbx" podStartSLOduration=75.154527247 podStartE2EDuration="1m15.154527247s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:01.154246819 +0000 UTC m=+93.903217055" watchObservedRunningTime="2025-12-03 14:08:01.154527247 +0000 UTC m=+93.903497483"
Dec 03 14:08:01 crc kubenswrapper[5004]: I1203 14:08:01.614080 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:01 crc kubenswrapper[5004]: E1203 14:08:01.614460 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:01 crc kubenswrapper[5004]: I1203 14:08:01.614611 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:01 crc kubenswrapper[5004]: E1203 14:08:01.616030 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:02 crc kubenswrapper[5004]: I1203 14:08:02.612557 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:02 crc kubenswrapper[5004]: I1203 14:08:02.612606 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:02 crc kubenswrapper[5004]: E1203 14:08:02.612686 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:02 crc kubenswrapper[5004]: E1203 14:08:02.612774 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:03 crc kubenswrapper[5004]: I1203 14:08:03.612241 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:03 crc kubenswrapper[5004]: I1203 14:08:03.612238 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:03 crc kubenswrapper[5004]: E1203 14:08:03.612372 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:03 crc kubenswrapper[5004]: E1203 14:08:03.612527 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:04 crc kubenswrapper[5004]: I1203 14:08:04.385053 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:04 crc kubenswrapper[5004]: E1203 14:08:04.385220 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 14:08:04 crc kubenswrapper[5004]: E1203 14:08:04.385283 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs podName:54394065-8262-4c2e-abdb-c81b096049ef nodeName:}" failed. No retries permitted until 2025-12-03 14:09:08.38526413 +0000 UTC m=+161.134234376 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs") pod "network-metrics-daemon-dgzr8" (UID: "54394065-8262-4c2e-abdb-c81b096049ef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 14:08:04 crc kubenswrapper[5004]: I1203 14:08:04.612357 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:04 crc kubenswrapper[5004]: I1203 14:08:04.612459 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:04 crc kubenswrapper[5004]: E1203 14:08:04.612581 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:04 crc kubenswrapper[5004]: E1203 14:08:04.612728 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:05 crc kubenswrapper[5004]: I1203 14:08:05.613113 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:05 crc kubenswrapper[5004]: I1203 14:08:05.613262 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:05 crc kubenswrapper[5004]: E1203 14:08:05.613839 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:05 crc kubenswrapper[5004]: E1203 14:08:05.614767 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:05 crc kubenswrapper[5004]: I1203 14:08:05.615457 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"
Dec 03 14:08:05 crc kubenswrapper[5004]: E1203 14:08:05.615800 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371"
Dec 03 14:08:06 crc kubenswrapper[5004]: I1203 14:08:06.612823 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:06 crc kubenswrapper[5004]: I1203 14:08:06.612949 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:06 crc kubenswrapper[5004]: E1203 14:08:06.613397 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:06 crc kubenswrapper[5004]: E1203 14:08:06.613299 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:07 crc kubenswrapper[5004]: I1203 14:08:07.612953 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:07 crc kubenswrapper[5004]: I1203 14:08:07.612964 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:07 crc kubenswrapper[5004]: E1203 14:08:07.614104 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:07 crc kubenswrapper[5004]: E1203 14:08:07.614312 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:08 crc kubenswrapper[5004]: I1203 14:08:08.612118 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:08 crc kubenswrapper[5004]: I1203 14:08:08.612145 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:08 crc kubenswrapper[5004]: E1203 14:08:08.612467 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:08 crc kubenswrapper[5004]: E1203 14:08:08.612544 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:09 crc kubenswrapper[5004]: I1203 14:08:09.612559 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:09 crc kubenswrapper[5004]: I1203 14:08:09.612614 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:09 crc kubenswrapper[5004]: E1203 14:08:09.612695 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:09 crc kubenswrapper[5004]: E1203 14:08:09.612755 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:10 crc kubenswrapper[5004]: I1203 14:08:10.612597 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:10 crc kubenswrapper[5004]: I1203 14:08:10.612640 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:10 crc kubenswrapper[5004]: E1203 14:08:10.612743 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:10 crc kubenswrapper[5004]: E1203 14:08:10.612825 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:11 crc kubenswrapper[5004]: I1203 14:08:11.613058 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:11 crc kubenswrapper[5004]: I1203 14:08:11.613132 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:11 crc kubenswrapper[5004]: E1203 14:08:11.613203 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:11 crc kubenswrapper[5004]: E1203 14:08:11.613357 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:12 crc kubenswrapper[5004]: I1203 14:08:12.612593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:12 crc kubenswrapper[5004]: I1203 14:08:12.612593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:12 crc kubenswrapper[5004]: E1203 14:08:12.612813 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:12 crc kubenswrapper[5004]: E1203 14:08:12.612931 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:13 crc kubenswrapper[5004]: I1203 14:08:13.613033 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:13 crc kubenswrapper[5004]: I1203 14:08:13.613160 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:13 crc kubenswrapper[5004]: E1203 14:08:13.613197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:13 crc kubenswrapper[5004]: E1203 14:08:13.613371 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:14 crc kubenswrapper[5004]: I1203 14:08:14.612740 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:14 crc kubenswrapper[5004]: I1203 14:08:14.612753 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:14 crc kubenswrapper[5004]: E1203 14:08:14.612899 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:14 crc kubenswrapper[5004]: E1203 14:08:14.612955 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:15 crc kubenswrapper[5004]: I1203 14:08:15.612114 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:15 crc kubenswrapper[5004]: I1203 14:08:15.612352 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:15 crc kubenswrapper[5004]: E1203 14:08:15.612478 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:15 crc kubenswrapper[5004]: E1203 14:08:15.612634 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:16 crc kubenswrapper[5004]: I1203 14:08:16.612215 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:16 crc kubenswrapper[5004]: E1203 14:08:16.612442 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:16 crc kubenswrapper[5004]: I1203 14:08:16.612250 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:16 crc kubenswrapper[5004]: E1203 14:08:16.613003 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:17 crc kubenswrapper[5004]: I1203 14:08:17.612128 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:17 crc kubenswrapper[5004]: E1203 14:08:17.614369 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:17 crc kubenswrapper[5004]: I1203 14:08:17.614439 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:17 crc kubenswrapper[5004]: E1203 14:08:17.614538 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:18 crc kubenswrapper[5004]: I1203 14:08:18.612543 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:18 crc kubenswrapper[5004]: I1203 14:08:18.612623 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:18 crc kubenswrapper[5004]: E1203 14:08:18.612799 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:18 crc kubenswrapper[5004]: E1203 14:08:18.612981 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:19 crc kubenswrapper[5004]: I1203 14:08:19.612299 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:19 crc kubenswrapper[5004]: I1203 14:08:19.612887 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:19 crc kubenswrapper[5004]: E1203 14:08:19.613064 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:19 crc kubenswrapper[5004]: E1203 14:08:19.613554 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:19 crc kubenswrapper[5004]: I1203 14:08:19.613908 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:08:19 crc kubenswrapper[5004]: E1203 14:08:19.614122 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvbnm_openshift-ovn-kubernetes(78eea523-e8ee-4f41-93b2-6bbfdcdf3371)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" Dec 03 14:08:20 crc kubenswrapper[5004]: I1203 14:08:20.612486 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:20 crc kubenswrapper[5004]: I1203 14:08:20.612513 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:20 crc kubenswrapper[5004]: E1203 14:08:20.612643 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:20 crc kubenswrapper[5004]: E1203 14:08:20.612725 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.198099 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/1.log" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.198833 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/0.log" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.198975 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff08cd56-3e47-4cd7-98ad-8571f178dc62" containerID="70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a" exitCode=1 Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.199027 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerDied","Data":"70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a"} Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.199107 5004 scope.go:117] "RemoveContainer" containerID="76d7ea4248f3b5291cb8467006f5f9680150e9b718b2995b6f3f70fdb6a26b7e" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.199682 5004 scope.go:117] "RemoveContainer" containerID="70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a" Dec 03 14:08:21 crc kubenswrapper[5004]: E1203 14:08:21.200021 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s6kp7_openshift-multus(ff08cd56-3e47-4cd7-98ad-8571f178dc62)\"" pod="openshift-multus/multus-s6kp7" podUID="ff08cd56-3e47-4cd7-98ad-8571f178dc62" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.612395 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:21 crc kubenswrapper[5004]: E1203 14:08:21.612707 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:21 crc kubenswrapper[5004]: I1203 14:08:21.612433 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:21 crc kubenswrapper[5004]: E1203 14:08:21.613477 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:22 crc kubenswrapper[5004]: I1203 14:08:22.203179 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/1.log" Dec 03 14:08:22 crc kubenswrapper[5004]: I1203 14:08:22.612662 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:22 crc kubenswrapper[5004]: I1203 14:08:22.612690 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:22 crc kubenswrapper[5004]: E1203 14:08:22.613022 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:22 crc kubenswrapper[5004]: E1203 14:08:22.613154 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:23 crc kubenswrapper[5004]: I1203 14:08:23.612772 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:23 crc kubenswrapper[5004]: I1203 14:08:23.612844 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:23 crc kubenswrapper[5004]: E1203 14:08:23.613361 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:23 crc kubenswrapper[5004]: E1203 14:08:23.613453 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:24 crc kubenswrapper[5004]: I1203 14:08:24.612360 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:24 crc kubenswrapper[5004]: I1203 14:08:24.612383 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:24 crc kubenswrapper[5004]: E1203 14:08:24.612545 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:24 crc kubenswrapper[5004]: E1203 14:08:24.612645 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:25 crc kubenswrapper[5004]: I1203 14:08:25.612312 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:25 crc kubenswrapper[5004]: I1203 14:08:25.612387 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:25 crc kubenswrapper[5004]: E1203 14:08:25.612448 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:25 crc kubenswrapper[5004]: E1203 14:08:25.612534 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:26 crc kubenswrapper[5004]: I1203 14:08:26.612068 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:26 crc kubenswrapper[5004]: I1203 14:08:26.612101 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:26 crc kubenswrapper[5004]: E1203 14:08:26.612264 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:26 crc kubenswrapper[5004]: E1203 14:08:26.612332 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:27 crc kubenswrapper[5004]: I1203 14:08:27.612153 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:27 crc kubenswrapper[5004]: E1203 14:08:27.613019 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:27 crc kubenswrapper[5004]: I1203 14:08:27.613086 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:27 crc kubenswrapper[5004]: E1203 14:08:27.613243 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:27 crc kubenswrapper[5004]: E1203 14:08:27.622534 5004 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 14:08:27 crc kubenswrapper[5004]: E1203 14:08:27.693695 5004 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 14:08:28 crc kubenswrapper[5004]: I1203 14:08:28.612767 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:28 crc kubenswrapper[5004]: E1203 14:08:28.613212 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:28 crc kubenswrapper[5004]: I1203 14:08:28.612767 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:28 crc kubenswrapper[5004]: E1203 14:08:28.613292 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:29 crc kubenswrapper[5004]: I1203 14:08:29.612013 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:29 crc kubenswrapper[5004]: E1203 14:08:29.612204 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:29 crc kubenswrapper[5004]: I1203 14:08:29.612362 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:29 crc kubenswrapper[5004]: E1203 14:08:29.612580 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:30 crc kubenswrapper[5004]: I1203 14:08:30.612529 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:30 crc kubenswrapper[5004]: E1203 14:08:30.612665 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:30 crc kubenswrapper[5004]: I1203 14:08:30.612721 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:08:30 crc kubenswrapper[5004]: E1203 14:08:30.612764 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef" Dec 03 14:08:31 crc kubenswrapper[5004]: I1203 14:08:31.612752 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:31 crc kubenswrapper[5004]: E1203 14:08:31.612930 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 14:08:31 crc kubenswrapper[5004]: I1203 14:08:31.613569 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:31 crc kubenswrapper[5004]: E1203 14:08:31.613693 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 14:08:32 crc kubenswrapper[5004]: I1203 14:08:32.612574 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:32 crc kubenswrapper[5004]: E1203 14:08:32.612722 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 14:08:32 crc kubenswrapper[5004]: I1203 14:08:32.612982 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:32 crc kubenswrapper[5004]: E1203 14:08:32.613199 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:32 crc kubenswrapper[5004]: E1203 14:08:32.695662 5004 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 03 14:08:33 crc kubenswrapper[5004]: I1203 14:08:33.612672 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:33 crc kubenswrapper[5004]: I1203 14:08:33.612710 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:33 crc kubenswrapper[5004]: E1203 14:08:33.613077 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:33 crc kubenswrapper[5004]: I1203 14:08:33.613254 5004 scope.go:117] "RemoveContainer" containerID="70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a"
Dec 03 14:08:33 crc kubenswrapper[5004]: E1203 14:08:33.613333 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:34 crc kubenswrapper[5004]: I1203 14:08:34.237149 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/1.log"
Dec 03 14:08:34 crc kubenswrapper[5004]: I1203 14:08:34.237210 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerStarted","Data":"f6b3217cb0590f575d85bd7a577d90b72df97e280035f1545948ccb27a9febb5"}
Dec 03 14:08:34 crc kubenswrapper[5004]: I1203 14:08:34.612551 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:34 crc kubenswrapper[5004]: I1203 14:08:34.612577 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:34 crc kubenswrapper[5004]: E1203 14:08:34.612681 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:34 crc kubenswrapper[5004]: E1203 14:08:34.613064 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:34 crc kubenswrapper[5004]: I1203 14:08:34.613394 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.243625 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/3.log"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.247527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerStarted","Data":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"}
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.247988 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.326704 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podStartSLOduration=109.326685224 podStartE2EDuration="1m49.326685224s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:35.276945607 +0000 UTC m=+128.025915833" watchObservedRunningTime="2025-12-03 14:08:35.326685224 +0000 UTC m=+128.075655460"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.327106 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dgzr8"]
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.327200 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:35 crc kubenswrapper[5004]: E1203 14:08:35.327280 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.611993 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:35 crc kubenswrapper[5004]: I1203 14:08:35.612033 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:35 crc kubenswrapper[5004]: E1203 14:08:35.612174 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:35 crc kubenswrapper[5004]: E1203 14:08:35.612268 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:36 crc kubenswrapper[5004]: I1203 14:08:36.612896 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:36 crc kubenswrapper[5004]: E1203 14:08:36.613075 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 14:08:37 crc kubenswrapper[5004]: I1203 14:08:37.612671 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:37 crc kubenswrapper[5004]: I1203 14:08:37.613752 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:37 crc kubenswrapper[5004]: E1203 14:08:37.613742 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 14:08:37 crc kubenswrapper[5004]: I1203 14:08:37.613893 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:37 crc kubenswrapper[5004]: E1203 14:08:37.614008 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 14:08:37 crc kubenswrapper[5004]: E1203 14:08:37.614135 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dgzr8" podUID="54394065-8262-4c2e-abdb-c81b096049ef"
Dec 03 14:08:38 crc kubenswrapper[5004]: I1203 14:08:38.612555 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 14:08:38 crc kubenswrapper[5004]: I1203 14:08:38.615425 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 03 14:08:38 crc kubenswrapper[5004]: I1203 14:08:38.615596 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.612657 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.612851 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.612657 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.615507 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.615599 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.616509 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 14:08:39 crc kubenswrapper[5004]: I1203 14:08:39.617365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.366325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.402642 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.403212 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.404358 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.404955 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.405348 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.406145 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.407559 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d5x7k"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.408316 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.410622 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.411193 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.411680 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.412390 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.424740 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.426308 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.426837 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.428035 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.438755 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.442732 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.462972 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463069 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463299 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463375 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463438 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463624 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463783 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.463937 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464050 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464526 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464729 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464807 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464832 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464890 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464965 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465051 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465087 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465123 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464970 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465214 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.464805 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465538 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465570 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.465908 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.466432 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.466541 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.466620 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.466685 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.467539 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.467795 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.467934 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.469414 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kj2rg"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.469941 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kj2rg"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.470022 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.470190 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.470543 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.470781 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.470949 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471030 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471107 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471194 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471268 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471336 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471404 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471475 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471550 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471631 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471730 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.471981 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472031 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472060 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472101 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472156 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472185 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472205 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472257 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472582 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.472976 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.473204 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.473560 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8xqx"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.473971 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.477988 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.478172 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.481625 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.481890 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482197 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482349 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482542 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482635 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482655 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.482767 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.483562 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.484081 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.485072 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.485206 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.485324 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.485889 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.486361 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.488569 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.489611 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ltc99"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.490457 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.491149 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5646w"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.491317 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.492349 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5646w"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.492379 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ltc99"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.525026 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.525299 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.525547 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526044 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526129 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526264 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526282 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526302 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526400 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.526508 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.527011 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.527194 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.527336 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.527356 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.527465 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.528442 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.528982 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529036 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529293 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529439 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529638 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529730 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.529799 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.530199 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.533470 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.534001 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94"]
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.534387 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.538175 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ll8wz"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.541486 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.541613 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.541646 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.542204 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.542243 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.542306 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.544490 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.546021 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.546283 5004 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.547052 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.547189 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.547362 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.549360 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.549679 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.551105 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.551900 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.553070 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.558418 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.558828 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.559315 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.559881 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.560121 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.560249 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.560388 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.560632 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.560981 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.561628 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.562578 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566231 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566276 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cmx\" (UniqueName: \"kubernetes.io/projected/8c89b35e-1cbc-45b2-b90b-ae778d622bb9-kube-api-access-j4cmx\") pod \"downloads-7954f5f757-5646w\" (UID: \"8c89b35e-1cbc-45b2-b90b-ae778d622bb9\") " 
pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566333 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxcc\" (UniqueName: \"kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566361 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-auth-proxy-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566382 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566408 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566428 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/067d381c-1dc8-40d0-880e-8b1d95cfef3e-machine-approver-tls\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566457 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dbd\" (UniqueName: \"kubernetes.io/projected/067d381c-1dc8-40d0-880e-8b1d95cfef3e-kube-api-access-47dbd\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-default-certificate\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566507 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5sr5\" (UniqueName: \"kubernetes.io/projected/8704c023-6680-4430-a7e7-b4aa5a76d365-kube-api-access-h5sr5\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566528 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-stats-auth\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " 
pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566574 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5mlg6"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566578 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-metrics-certs\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566726 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8704c023-6680-4430-a7e7-b4aa5a76d365-service-ca-bundle\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.566758 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.567319 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.567317 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.567478 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.567423 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.568080 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.568412 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.568495 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.569207 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwxpp"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.569746 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.571904 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.572533 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.576351 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.576816 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.576980 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.578366 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.583484 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.585556 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.585719 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.585720 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.587050 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vcl7"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.589890 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.595329 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.595359 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.595370 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d5x7k"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.595449 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.599371 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xhhgn"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.600452 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.601458 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kj2rg"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.603925 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.606105 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.607493 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.608672 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.609607 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.610545 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8xqx"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.611752 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.617013 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.617041 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm"] Dec 03 
14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.617054 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.618059 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.619442 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.620973 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.622425 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.625458 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.625621 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.628102 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.629561 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.630619 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.631602 
5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.632637 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.633960 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.636501 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mtcwc"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.637443 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.637501 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tnfw8"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.639119 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5646w"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.639212 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.640076 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.641139 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.642230 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.643480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.644800 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.646343 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.646951 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5mlg6"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.648012 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xhhgn"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.649017 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tnfw8"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.649883 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwxpp"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 
14:08:41.651056 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vcl7"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.652207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.653450 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.654545 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-65k7d"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.655255 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.655582 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65k7d"] Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.665492 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667300 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47dbd\" (UniqueName: \"kubernetes.io/projected/067d381c-1dc8-40d0-880e-8b1d95cfef3e-kube-api-access-47dbd\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667331 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-default-certificate\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " 
pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667358 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5sr5\" (UniqueName: \"kubernetes.io/projected/8704c023-6680-4430-a7e7-b4aa5a76d365-kube-api-access-h5sr5\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667381 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-stats-auth\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667411 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8704c023-6680-4430-a7e7-b4aa5a76d365-service-ca-bundle\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667430 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-metrics-certs\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667472 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: 
\"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667520 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cmx\" (UniqueName: \"kubernetes.io/projected/8c89b35e-1cbc-45b2-b90b-ae778d622bb9-kube-api-access-j4cmx\") pod \"downloads-7954f5f757-5646w\" (UID: \"8c89b35e-1cbc-45b2-b90b-ae778d622bb9\") " pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667588 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxcc\" (UniqueName: \"kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667609 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-auth-proxy-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667639 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667663 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.667686 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/067d381c-1dc8-40d0-880e-8b1d95cfef3e-machine-approver-tls\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.673589 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.674071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: 
I1203 14:08:41.674361 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-auth-proxy-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.675367 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-default-certificate\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.675531 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067d381c-1dc8-40d0-880e-8b1d95cfef3e-config\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.676093 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-stats-auth\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.676469 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/067d381c-1dc8-40d0-880e-8b1d95cfef3e-machine-approver-tls\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 
14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.684869 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.686040 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.692973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704c023-6680-4430-a7e7-b4aa5a76d365-metrics-certs\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.705178 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.708836 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8704c023-6680-4430-a7e7-b4aa5a76d365-service-ca-bundle\") pod \"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.725386 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.746232 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 
14:08:41.764729 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.785843 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.805626 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.825754 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.857133 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.867300 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.893675 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.906204 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.925609 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.945698 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 14:08:41 crc kubenswrapper[5004]: I1203 14:08:41.985984 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.006376 5004 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.025956 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.046017 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.065333 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.086007 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.109633 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.129802 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.148163 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.167734 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.185627 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 
14:08:42.204693 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.225936 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.245644 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.265479 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.285793 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.305965 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.325566 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.344764 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.365414 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.386897 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.405648 5004 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.425838 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.446375 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.467202 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.487892 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.506275 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.526032 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.545273 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.565397 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.584576 5004 request.go:700] Waited for 1.015977923s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.586468 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.605643 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.625625 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.646672 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.673333 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.686115 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.706203 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.725906 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.746077 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.766054 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.786925 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.806091 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.825980 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.846925 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.866434 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.886197 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.905623 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.925984 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.946317 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.965793 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:08:42 crc kubenswrapper[5004]: I1203 14:08:42.985811 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.006710 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.026620 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.045882 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.065814 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.086244 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.104979 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.125645 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.145910 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.166114 5004 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.186353 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.206167 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.226045 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.246282 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.265283 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.305329 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.325672 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.346293 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.367011 5004 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386468 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386512 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-images\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386532 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-config\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-serving-cert\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386562 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-service-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386581 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtmng\" (UniqueName: \"kubernetes.io/projected/eaf11e19-667b-4d30-b6fa-71af6a5a1182-kube-api-access-wtmng\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386596 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386611 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srhg\" (UniqueName: \"kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386843 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386833 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8m2q\" (UniqueName: \"kubernetes.io/projected/e52c9fcc-c539-4037-9bae-810fecabe628-kube-api-access-h8m2q\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386906 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386928 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dp4\" (UniqueName: \"kubernetes.io/projected/dfbd3e28-71fd-4412-8218-9ca072542838-kube-api-access-72dp4\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386949 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxvdx\" (UniqueName: \"kubernetes.io/projected/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-kube-api-access-sxvdx\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.386992 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-config\") pod 
\"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387011 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387037 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387060 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-image-import-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387082 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc 
kubenswrapper[5004]: I1203 14:08:43.387118 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-encryption-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387136 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387157 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhq9c\" (UniqueName: \"kubernetes.io/projected/ae44e9f6-1abb-4d46-9605-4c51579c6933-kube-api-access-qhq9c\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387175 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-client\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387216 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-serving-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387233 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-encryption-config\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387256 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9gd\" (UniqueName: \"kubernetes.io/projected/a887d450-ffa8-4b30-98db-2e223c46b134-kube-api-access-nr9gd\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387400 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-client\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387523 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387552 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387587 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387627 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-config\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387643 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcszw\" (UniqueName: \"kubernetes.io/projected/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-kube-api-access-hcszw\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: 
\"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387687 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-node-pullsecrets\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387716 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-trusted-ca\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387731 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-policies\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387807 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387844 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea4dcd98-ce38-4c3e-93d2-9d714f509954-serving-cert\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387909 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387945 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-client\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.387989 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45ef6725-be2e-4fac-8158-4322a766ac08-metrics-tls\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388109 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-serving-cert\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388132 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388250 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388303 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxtd\" (UniqueName: \"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-kube-api-access-dcxtd\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388697 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52c9fcc-c539-4037-9bae-810fecabe628-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: 
\"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388722 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkd4\" (UniqueName: \"kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388744 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-serving-cert\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388767 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae44e9f6-1abb-4d46-9605-4c51579c6933-serving-cert\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-config\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388831 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388885 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-dir\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388925 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388957 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.388988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vxwzk\" 
(UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389019 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389037 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389075 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ef6725-be2e-4fac-8158-4322a766ac08-trusted-ca\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389098 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-config\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389156 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389181 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389228 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52c9fcc-c539-4037-9bae-810fecabe628-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389369 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389457 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389598 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389733 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.389834 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:43.889772872 +0000 UTC m=+136.638743128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.389966 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390061 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtw5\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390092 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390129 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/dfbd3e28-71fd-4412-8218-9ca072542838-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390192 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390232 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jzv\" (UniqueName: \"kubernetes.io/projected/ea4dcd98-ce38-4c3e-93d2-9d714f509954-kube-api-access-d6jzv\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390353 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit-dir\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " 
pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390524 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a887d450-ffa8-4b30-98db-2e223c46b134-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390544 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.390566 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.406750 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.425204 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.447492 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.465914 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491230 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491475 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6zr\" (UniqueName: \"kubernetes.io/projected/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-kube-api-access-lg6zr\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.491608 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:43.991513718 +0000 UTC m=+136.740483984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491652 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-srv-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491696 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae44e9f6-1abb-4d46-9605-4c51579c6933-serving-cert\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491756 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199db3af-ca8b-4ae4-8adf-46a0facb2d55-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491819 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-cabundle\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-dir\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.491984 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-dir\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492030 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72cc10e2-06e7-4827-b787-3a3d9c2566a5-proxy-tls\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492124 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492212 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492288 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ef6725-be2e-4fac-8158-4322a766ac08-trusted-ca\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492384 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-webhook-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492631 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6g8z\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-kube-api-access-f6g8z\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492756 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: 
\"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492802 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-plugins-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492847 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492924 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.492968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrxw\" (UniqueName: \"kubernetes.io/projected/af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8-kube-api-access-ltrxw\") pod \"migrator-59844c95c7-2g5c8\" (UID: \"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493054 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493086 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493125 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493155 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtw5\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493189 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493221 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/958edf90-36c2-4be7-b1fc-b35607b151e4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493801 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c452a836-d7b1-45d5-b07e-715591179f58-serving-cert\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493835 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c452a836-d7b1-45d5-b07e-715591179f58-config\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.493897 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494047 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jzv\" (UniqueName: \"kubernetes.io/projected/ea4dcd98-ce38-4c3e-93d2-9d714f509954-kube-api-access-d6jzv\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjxs\" (UniqueName: \"kubernetes.io/projected/4035ba65-b9bf-4363-96b4-ee3bcfd55988-kube-api-access-8kjxs\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: \"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494172 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 
crc kubenswrapper[5004]: I1203 14:08:43.494214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png7j\" (UniqueName: \"kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494244 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-certs\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494272 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-tmpfs\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494311 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494345 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: 
\"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494380 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ef6725-be2e-4fac-8158-4322a766ac08-trusted-ca\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494403 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494412 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494536 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-config\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494566 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-serving-cert\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494594 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c221c29b-d053-4b03-a758-ff1f5fada663-metrics-tls\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494623 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6srhg\" (UniqueName: \"kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494647 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494668 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h8m2q\" (UniqueName: \"kubernetes.io/projected/e52c9fcc-c539-4037-9bae-810fecabe628-kube-api-access-h8m2q\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494696 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72dp4\" (UniqueName: \"kubernetes.io/projected/dfbd3e28-71fd-4412-8218-9ca072542838-kube-api-access-72dp4\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494726 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-key\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494748 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494908 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.494948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psg48\" (UniqueName: \"kubernetes.io/projected/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-kube-api-access-psg48\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495091 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495125 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-image-import-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495172 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhq9c\" (UniqueName: \"kubernetes.io/projected/ae44e9f6-1abb-4d46-9605-4c51579c6933-kube-api-access-qhq9c\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495250 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9gd\" (UniqueName: \"kubernetes.io/projected/a887d450-ffa8-4b30-98db-2e223c46b134-kube-api-access-nr9gd\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495289 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495450 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-config\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495781 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495892 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-srv-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.495998 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-client\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: 
\"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496113 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199db3af-ca8b-4ae4-8adf-46a0facb2d55-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496330 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496455 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496576 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496710 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" 
(UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-csi-data-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496844 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-node-pullsecrets\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.496984 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4035ba65-b9bf-4363-96b4-ee3bcfd55988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: \"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497055 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497069 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-image-import-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc 
kubenswrapper[5004]: I1203 14:08:43.497165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de87900-83eb-4764-b478-959ab83fb572-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497195 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-trusted-ca\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497230 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4168b6-1d83-48c4-95f1-88b04d773564-cert\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497230 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497247 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwdf\" (UniqueName: \"kubernetes.io/projected/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-kube-api-access-6gwdf\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497479 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.497798 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498024 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498121 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-client\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498456 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-serving-cert\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498504 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.498609 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:43.998592546 +0000 UTC m=+136.747562782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498686 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-node-pullsecrets\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498771 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.498931 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45ef6725-be2e-4fac-8158-4322a766ac08-metrics-tls\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.499047 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.500375 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-trusted-ca\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.500620 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9kn\" (UniqueName: \"kubernetes.io/projected/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-kube-api-access-th9kn\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.500758 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.500816 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.500855 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6defa455-47dd-4d1f-a77d-a3a4617df1b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501258 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4gz\" (UniqueName: \"kubernetes.io/projected/6defa455-47dd-4d1f-a77d-a3a4617df1b8-kube-api-access-tk4gz\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926e2906-448f-4006-a186-2b45932f51e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501538 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-serving-cert\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501641 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/926e2906-448f-4006-a186-2b45932f51e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501850 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.501977 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxtd\" (UniqueName: \"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-kube-api-access-dcxtd\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502116 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svkd4\" (UniqueName: \"kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502225 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958edf90-36c2-4be7-b1fc-b35607b151e4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502339 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dbd\" (UniqueName: \"kubernetes.io/projected/067d381c-1dc8-40d0-880e-8b1d95cfef3e-kube-api-access-47dbd\") pod \"machine-approver-56656f9798-k6knf\" (UID: \"067d381c-1dc8-40d0-880e-8b1d95cfef3e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502347 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52c9fcc-c539-4037-9bae-810fecabe628-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502426 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c472p\" (UniqueName: \"kubernetes.io/projected/bcb95056-cffc-433a-a3a7-17ad434cf41f-kube-api-access-c472p\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvht9\" (UniqueName: \"kubernetes.io/projected/585f1543-d912-4485-b645-3c818242f920-kube-api-access-qvht9\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhlg\" (UniqueName: \"kubernetes.io/projected/f7098953-20ce-4f6d-a04e-c79d2811ecd6-kube-api-access-gmhlg\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502503 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9de87900-83eb-4764-b478-959ab83fb572-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502558 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-serving-cert\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502582 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502609 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-config\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502633 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxht4\" (UniqueName: \"kubernetes.io/projected/cd4168b6-1d83-48c4-95f1-88b04d773564-kube-api-access-qxht4\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502656 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84639727-6aec-44af-a590-fc6f6a11ba3d-config\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502683 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502709 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77632d37-a94f-4bc0-a07c-7880d70c7d5f-metrics-tls\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502715 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-client\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502732 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b676595-3778-4703-a0b1-654d54d007fc-proxy-tls\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502760 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502788 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502813 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-config\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502841 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/958edf90-36c2-4be7-b1fc-b35607b151e4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502899 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52c9fcc-c539-4037-9bae-810fecabe628-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502937 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502966 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sgjh\" (UniqueName: \"kubernetes.io/projected/77632d37-a94f-4bc0-a07c-7880d70c7d5f-kube-api-access-8sgjh\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502993 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69l9d\" (UniqueName: \"kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503016 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503043 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503068 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfbd3e28-71fd-4412-8218-9ca072542838-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503094 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-client\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503126 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503152 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9ww\" (UniqueName: \"kubernetes.io/projected/926e2906-448f-4006-a186-2b45932f51e6-kube-api-access-nh9ww\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503177 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m5q4\" (UniqueName: \"kubernetes.io/projected/5b676595-3778-4703-a0b1-654d54d007fc-kube-api-access-6m5q4\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit-dir\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a887d450-ffa8-4b30-98db-2e223c46b134-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503307 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-images\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503390 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-service-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503423 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtmng\" (UniqueName: \"kubernetes.io/projected/eaf11e19-667b-4d30-b6fa-71af6a5a1182-kube-api-access-wtmng\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503459 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5k2\" (UniqueName: \"kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503498 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503540 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99pt\" (UniqueName: \"kubernetes.io/projected/c452a836-d7b1-45d5-b07e-715591179f58-kube-api-access-k99pt\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503573 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-node-bootstrap-token\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503606 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxvdx\" (UniqueName: \"kubernetes.io/projected/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-kube-api-access-sxvdx\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-config\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503668 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-images\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503703 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9rz\" (UniqueName: \"kubernetes.io/projected/199db3af-ca8b-4ae4-8adf-46a0facb2d55-kube-api-access-pc9rz\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503796 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503820 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.504077 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-config\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.503829 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-client\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.504153 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.504189 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.502035 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.504759 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.504950 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.505071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/45ef6725-be2e-4fac-8158-4322a766ac08-metrics-tls\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.505502 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.505623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a887d450-ffa8-4b30-98db-2e223c46b134-images\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.505644 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-config\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.505912 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.506067 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaf11e19-667b-4d30-b6fa-71af6a5a1182-etcd-service-ca\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.506235 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.506824 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.507633 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.507672 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-audit-dir\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.508348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52c9fcc-c539-4037-9bae-810fecabe628-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.508824 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.508842 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-etcd-client\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.508991 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e52c9fcc-c539-4037-9bae-810fecabe628-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510191 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510529 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae44e9f6-1abb-4d46-9605-4c51579c6933-serving-cert\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510667 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-socket-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-encryption-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510758 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-registration-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.510781 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-encryption-config\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.511056 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-serving-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.511552 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4dcd98-ce38-4c3e-93d2-9d714f509954-config\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.511544 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk"
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 
14:08:43.511877 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.511944 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-etcd-serving-ca\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.511948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512024 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84639727-6aec-44af-a590-fc6f6a11ba3d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae44e9f6-1abb-4d46-9605-4c51579c6933-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: 
\"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512302 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-config\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512337 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcszw\" (UniqueName: \"kubernetes.io/projected/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-kube-api-access-hcszw\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512506 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-profile-collector-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512580 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-policies\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-xqj9m\" (UniqueName: \"kubernetes.io/projected/c221c29b-d053-4b03-a758-ff1f5fada663-kube-api-access-xqj9m\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512670 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnmd\" (UniqueName: \"kubernetes.io/projected/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-kube-api-access-4wnmd\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea4dcd98-ce38-4c3e-93d2-9d714f509954-serving-cert\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512762 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xjr\" (UniqueName: \"kubernetes.io/projected/72cc10e2-06e7-4827-b787-3a3d9c2566a5-kube-api-access-52xjr\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512786 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84639727-6aec-44af-a590-fc6f6a11ba3d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512825 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512847 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c221c29b-d053-4b03-a758-ff1f5fada663-config-volume\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.512914 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72cc10e2-06e7-4827-b787-3a3d9c2566a5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.513398 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.513509 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-audit-policies\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.513881 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-config\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.514333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-encryption-config\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.514405 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfbd3e28-71fd-4412-8218-9ca072542838-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.514441 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.514585 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.515556 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.516197 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea4dcd98-ce38-4c3e-93d2-9d714f509954-serving-cert\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.516236 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a887d450-ffa8-4b30-98db-2e223c46b134-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.516372 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf11e19-667b-4d30-b6fa-71af6a5a1182-serving-cert\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.516532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.516822 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-encryption-config\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.517232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.519329 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-serving-cert\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.526160 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5sr5\" (UniqueName: \"kubernetes.io/projected/8704c023-6680-4430-a7e7-b4aa5a76d365-kube-api-access-h5sr5\") pod 
\"router-default-5444994796-ltc99\" (UID: \"8704c023-6680-4430-a7e7-b4aa5a76d365\") " pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.541394 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxcc\" (UniqueName: \"kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc\") pod \"route-controller-manager-6576b87f9c-mjz2z\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.560395 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cmx\" (UniqueName: \"kubernetes.io/projected/8c89b35e-1cbc-45b2-b90b-ae778d622bb9-kube-api-access-j4cmx\") pod \"downloads-7954f5f757-5646w\" (UID: \"8c89b35e-1cbc-45b2-b90b-ae778d622bb9\") " pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.585889 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.599562 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jzv\" (UniqueName: \"kubernetes.io/projected/ea4dcd98-ce38-4c3e-93d2-9d714f509954-kube-api-access-d6jzv\") pod \"console-operator-58897d9998-kj2rg\" (UID: \"ea4dcd98-ce38-4c3e-93d2-9d714f509954\") " pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.618123 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.619197 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.619532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg6zr\" (UniqueName: \"kubernetes.io/projected/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-kube-api-access-lg6zr\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.619709 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.119351051 +0000 UTC m=+136.868321287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.619803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199db3af-ca8b-4ae4-8adf-46a0facb2d55-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620078 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-srv-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620132 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-cabundle\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" 
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620164 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72cc10e2-06e7-4827-b787-3a3d9c2566a5-proxy-tls\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-webhook-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620232 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6g8z\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-kube-api-access-f6g8z\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620303 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-plugins-dir\") 
pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620335 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620363 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrxw\" (UniqueName: \"kubernetes.io/projected/af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8-kube-api-access-ltrxw\") pod \"migrator-59844c95c7-2g5c8\" (UID: \"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620405 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620433 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/958edf90-36c2-4be7-b1fc-b35607b151e4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620495 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/c452a836-d7b1-45d5-b07e-715591179f58-serving-cert\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620520 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c452a836-d7b1-45d5-b07e-715591179f58-config\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620528 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8m2q\" (UniqueName: \"kubernetes.io/projected/e52c9fcc-c539-4037-9bae-810fecabe628-kube-api-access-h8m2q\") pod \"openshift-apiserver-operator-796bbdcf4f-wm25j\" (UID: \"e52c9fcc-c539-4037-9bae-810fecabe628\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-certs\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620571 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-tmpfs\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620599 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8kjxs\" (UniqueName: \"kubernetes.io/projected/4035ba65-b9bf-4363-96b4-ee3bcfd55988-kube-api-access-8kjxs\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: \"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620626 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620651 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png7j\" (UniqueName: \"kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620679 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620705 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c221c29b-d053-4b03-a758-ff1f5fada663-metrics-tls\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 
14:08:43.620731 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620757 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620815 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-key\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620875 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620916 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psg48\" (UniqueName: \"kubernetes.io/projected/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-kube-api-access-psg48\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" 
Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.620985 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621018 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-srv-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621052 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199db3af-ca8b-4ae4-8adf-46a0facb2d55-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621082 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config\") pod 
\"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621106 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-csi-data-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621143 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4035ba65-b9bf-4363-96b4-ee3bcfd55988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: \"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621165 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de87900-83eb-4764-b478-959ab83fb572-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621220 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4168b6-1d83-48c4-95f1-88b04d773564-cert\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621244 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwdf\" (UniqueName: \"kubernetes.io/projected/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-kube-api-access-6gwdf\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621297 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6defa455-47dd-4d1f-a77d-a3a4617df1b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621322 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4gz\" (UniqueName: \"kubernetes.io/projected/6defa455-47dd-4d1f-a77d-a3a4617df1b8-kube-api-access-tk4gz\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621346 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-th9kn\" (UniqueName: \"kubernetes.io/projected/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-kube-api-access-th9kn\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621368 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/926e2906-448f-4006-a186-2b45932f51e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621389 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926e2906-448f-4006-a186-2b45932f51e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621440 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958edf90-36c2-4be7-b1fc-b35607b151e4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621461 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9de87900-83eb-4764-b478-959ab83fb572-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621484 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c472p\" (UniqueName: \"kubernetes.io/projected/bcb95056-cffc-433a-a3a7-17ad434cf41f-kube-api-access-c472p\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvht9\" (UniqueName: \"kubernetes.io/projected/585f1543-d912-4485-b645-3c818242f920-kube-api-access-qvht9\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621529 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhlg\" (UniqueName: \"kubernetes.io/projected/f7098953-20ce-4f6d-a04e-c79d2811ecd6-kube-api-access-gmhlg\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621556 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxht4\" (UniqueName: \"kubernetes.io/projected/cd4168b6-1d83-48c4-95f1-88b04d773564-kube-api-access-qxht4\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 
14:08:43.621581 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84639727-6aec-44af-a590-fc6f6a11ba3d-config\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621605 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77632d37-a94f-4bc0-a07c-7880d70c7d5f-metrics-tls\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621625 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b676595-3778-4703-a0b1-654d54d007fc-proxy-tls\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621647 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/958edf90-36c2-4be7-b1fc-b35607b151e4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621674 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sgjh\" (UniqueName: \"kubernetes.io/projected/77632d37-a94f-4bc0-a07c-7880d70c7d5f-kube-api-access-8sgjh\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") 
" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621698 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69l9d\" (UniqueName: \"kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621764 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621795 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9ww\" (UniqueName: \"kubernetes.io/projected/926e2906-448f-4006-a186-2b45932f51e6-kube-api-access-nh9ww\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621820 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6m5q4\" (UniqueName: \"kubernetes.io/projected/5b676595-3778-4703-a0b1-654d54d007fc-kube-api-access-6m5q4\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621885 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5k2\" (UniqueName: \"kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621917 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k99pt\" (UniqueName: \"kubernetes.io/projected/c452a836-d7b1-45d5-b07e-715591179f58-kube-api-access-k99pt\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621938 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-node-bootstrap-token\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9rz\" (UniqueName: \"kubernetes.io/projected/199db3af-ca8b-4ae4-8adf-46a0facb2d55-kube-api-access-pc9rz\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: 
\"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.621986 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-images\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622011 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622034 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-socket-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622079 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-registration-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622130 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84639727-6aec-44af-a590-fc6f6a11ba3d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622177 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-profile-collector-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622201 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqj9m\" (UniqueName: \"kubernetes.io/projected/c221c29b-d053-4b03-a758-ff1f5fada663-kube-api-access-xqj9m\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622223 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnmd\" (UniqueName: \"kubernetes.io/projected/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-kube-api-access-4wnmd\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 
14:08:43.622245 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xjr\" (UniqueName: \"kubernetes.io/projected/72cc10e2-06e7-4827-b787-3a3d9c2566a5-kube-api-access-52xjr\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622267 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84639727-6aec-44af-a590-fc6f6a11ba3d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622312 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c221c29b-d053-4b03-a758-ff1f5fada663-config-volume\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.622338 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72cc10e2-06e7-4827-b787-3a3d9c2566a5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: 
\"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.624343 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-cabundle\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.624630 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.624899 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.124884123 +0000 UTC m=+136.873854359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.625449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-plugins-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.625773 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.625988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-tmpfs\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.626892 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199db3af-ca8b-4ae4-8adf-46a0facb2d55-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: 
\"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.627080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c452a836-d7b1-45d5-b07e-715591179f58-config\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.627684 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.628253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199db3af-ca8b-4ae4-8adf-46a0facb2d55-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.628600 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.629732 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.630090 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.630184 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.630385 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926e2906-448f-4006-a186-2b45932f51e6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.630506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72cc10e2-06e7-4827-b787-3a3d9c2566a5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.630681 
5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.631063 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-webhook-cert\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.631549 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958edf90-36c2-4be7-b1fc-b35607b151e4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.631994 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-certs\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.632237 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-csi-data-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.632847 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-srv-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.633496 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84639727-6aec-44af-a590-fc6f6a11ba3d-config\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.634241 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-socket-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.634388 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5b676595-3778-4703-a0b1-654d54d007fc-images\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.634460 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-registration-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.635512 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.635700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-srv-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.636018 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f7098953-20ce-4f6d-a04e-c79d2811ecd6-mountpoint-dir\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.636734 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c221c29b-d053-4b03-a758-ff1f5fada663-config-volume\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.636763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/926e2906-448f-4006-a186-2b45932f51e6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:43 crc 
kubenswrapper[5004]: I1203 14:08:43.636795 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de87900-83eb-4764-b478-959ab83fb572-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.637232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c452a836-d7b1-45d5-b07e-715591179f58-serving-cert\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.637353 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-signing-key\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.637711 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bcb95056-cffc-433a-a3a7-17ad434cf41f-profile-collector-cert\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.638152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4035ba65-b9bf-4363-96b4-ee3bcfd55988-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: 
\"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.638301 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.638508 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.638755 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.639253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72cc10e2-06e7-4827-b787-3a3d9c2566a5-proxy-tls\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.639288 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c221c29b-d053-4b03-a758-ff1f5fada663-metrics-tls\") pod \"dns-default-65k7d\" (UID: \"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.639716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/958edf90-36c2-4be7-b1fc-b35607b151e4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640076 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640252 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b676595-3778-4703-a0b1-654d54d007fc-proxy-tls\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640382 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/585f1543-d912-4485-b645-3c818242f920-node-bootstrap-token\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640758 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/77632d37-a94f-4bc0-a07c-7880d70c7d5f-metrics-tls\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640813 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9de87900-83eb-4764-b478-959ab83fb572-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.640929 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.641164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6defa455-47dd-4d1f-a77d-a3a4617df1b8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:43 crc 
kubenswrapper[5004]: I1203 14:08:43.641459 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd4168b6-1d83-48c4-95f1-88b04d773564-cert\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.642513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84639727-6aec-44af-a590-fc6f6a11ba3d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.643015 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.643921 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9gd\" (UniqueName: \"kubernetes.io/projected/a887d450-ffa8-4b30-98db-2e223c46b134-kube-api-access-nr9gd\") pod \"machine-api-operator-5694c8668f-mdbfw\" (UID: \"a887d450-ffa8-4b30-98db-2e223c46b134\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.647350 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.666516 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cbc1305-4f62-4db0-85ac-47bf78c2ae85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sz8cd\" (UID: \"9cbc1305-4f62-4db0-85ac-47bf78c2ae85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.678055 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhq9c\" (UniqueName: \"kubernetes.io/projected/ae44e9f6-1abb-4d46-9605-4c51579c6933-kube-api-access-qhq9c\") pod \"authentication-operator-69f744f599-7bg9k\" (UID: \"ae44e9f6-1abb-4d46-9605-4c51579c6933\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.696597 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.701436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dp4\" (UniqueName: \"kubernetes.io/projected/dfbd3e28-71fd-4412-8218-9ca072542838-kube-api-access-72dp4\") pod \"cluster-samples-operator-665b6dd947-6qgs2\" (UID: \"dfbd3e28-71fd-4412-8218-9ca072542838\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.705516 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.715062 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.720163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srhg\" (UniqueName: \"kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg\") pod \"controller-manager-879f6c89f-v4v6z\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.724963 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.725593 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.725780 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.225757904 +0000 UTC m=+136.974728140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.728509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.729120 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.229094082 +0000 UTC m=+136.978064318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.748303 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkd4\" (UniqueName: \"kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4\") pod \"oauth-openshift-558db77b4-vxwzk\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.750341 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.768124 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.775439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxvdx\" (UniqueName: \"kubernetes.io/projected/d598385d-7b3b-4ac4-be9d-8523a0a14bd0-kube-api-access-sxvdx\") pod \"apiserver-76f77b778f-d5x7k\" (UID: \"d598385d-7b3b-4ac4-be9d-8523a0a14bd0\") " pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.789113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtw5\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.799007 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.808092 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.836569 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.836985 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:44.336961769 +0000 UTC m=+137.085932005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.836898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.837950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.838362 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.338350369 +0000 UTC m=+137.087320605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.844269 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxtd\" (UniqueName: \"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-kube-api-access-dcxtd\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.847212 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.851690 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ef6725-be2e-4fac-8158-4322a766ac08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c45j8\" (UID: \"45ef6725-be2e-4fac-8158-4322a766ac08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.862706 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtmng\" (UniqueName: \"kubernetes.io/projected/eaf11e19-667b-4d30-b6fa-71af6a5a1182-kube-api-access-wtmng\") pod \"etcd-operator-b45778765-q8xqx\" (UID: \"eaf11e19-667b-4d30-b6fa-71af6a5a1182\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.872717 5004 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.904508 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcszw\" (UniqueName: \"kubernetes.io/projected/6d600ee4-d7d6-4478-bf62-1383c0f9b35c-kube-api-access-hcszw\") pod \"apiserver-7bbb656c7d-jgs55\" (UID: \"6d600ee4-d7d6-4478-bf62-1383c0f9b35c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.929223 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.930177 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.939099 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.939625 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.439580211 +0000 UTC m=+137.188550447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.939970 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:43 crc kubenswrapper[5004]: E1203 14:08:43.940408 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.440380034 +0000 UTC m=+137.189350420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.953635 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6zr\" (UniqueName: \"kubernetes.io/projected/ff0b812e-08c2-4e3f-bb8e-e7bc314e7533-kube-api-access-lg6zr\") pod \"olm-operator-6b444d44fb-knk6r\" (UID: \"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.967908 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6g8z\" (UniqueName: \"kubernetes.io/projected/9de87900-83eb-4764-b478-959ab83fb572-kube-api-access-f6g8z\") pod \"cluster-image-registry-operator-dc59b4c8b-lcc4c\" (UID: \"9de87900-83eb-4764-b478-959ab83fb572\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.976365 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.977142 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.984139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png7j\" (UniqueName: \"kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j\") pod \"console-f9d7485db-ll8wz\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:43 crc kubenswrapper[5004]: I1203 14:08:43.992135 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.008183 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/958edf90-36c2-4be7-b1fc-b35607b151e4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-h5gl4\" (UID: \"958edf90-36c2-4be7-b1fc-b35607b151e4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.023631 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psg48\" (UniqueName: \"kubernetes.io/projected/f986649e-61c8-4c67-beb3-edc5dc4e4fd9-kube-api-access-psg48\") pod \"control-plane-machine-set-operator-78cbb6b69f-swmpz\" (UID: \"f986649e-61c8-4c67-beb3-edc5dc4e4fd9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.042761 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.043600 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.044055 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.544038697 +0000 UTC m=+137.293008933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.059729 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrxw\" (UniqueName: \"kubernetes.io/projected/af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8-kube-api-access-ltrxw\") pod \"migrator-59844c95c7-2g5c8\" (UID: \"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.099964 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sgjh\" (UniqueName: 
\"kubernetes.io/projected/77632d37-a94f-4bc0-a07c-7880d70c7d5f-kube-api-access-8sgjh\") pod \"dns-operator-744455d44c-9vcl7\" (UID: \"77632d37-a94f-4bc0-a07c-7880d70c7d5f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.112128 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.125788 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69l9d\" (UniqueName: \"kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d\") pod \"collect-profiles-29412840-v4lg7\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.136699 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.139935 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.143393 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.145457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.145866 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.645817315 +0000 UTC m=+137.394787551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.146332 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m5q4\" (UniqueName: \"kubernetes.io/projected/5b676595-3778-4703-a0b1-654d54d007fc-kube-api-access-6m5q4\") pod \"machine-config-operator-74547568cd-6pqqp\" (UID: \"5b676595-3778-4703-a0b1-654d54d007fc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.174852 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.180731 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9ww\" (UniqueName: \"kubernetes.io/projected/926e2906-448f-4006-a186-2b45932f51e6-kube-api-access-nh9ww\") pod \"kube-storage-version-migrator-operator-b67b599dd-6vs6f\" (UID: \"926e2906-448f-4006-a186-2b45932f51e6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.198795 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.226789 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.243874 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99pt\" (UniqueName: \"kubernetes.io/projected/c452a836-d7b1-45d5-b07e-715591179f58-kube-api-access-k99pt\") pod \"service-ca-operator-777779d784-cvx5j\" (UID: \"c452a836-d7b1-45d5-b07e-715591179f58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.245616 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84639727-6aec-44af-a590-fc6f6a11ba3d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lbc4g\" (UID: \"84639727-6aec-44af-a590-fc6f6a11ba3d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.246598 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwdf\" (UniqueName: \"kubernetes.io/projected/46f500f6-07e9-4242-9d25-31a3fc4e5a6d-kube-api-access-6gwdf\") pod \"service-ca-9c57cc56f-rwxpp\" (UID: \"46f500f6-07e9-4242-9d25-31a3fc4e5a6d\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.246967 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.247293 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.747281142 +0000 UTC m=+137.496251378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.259449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c472p\" (UniqueName: \"kubernetes.io/projected/bcb95056-cffc-433a-a3a7-17ad434cf41f-kube-api-access-c472p\") pod \"catalog-operator-68c6474976-whn2l\" (UID: \"bcb95056-cffc-433a-a3a7-17ad434cf41f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.290816 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.291440 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.298140 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ltc99" event={"ID":"8704c023-6680-4430-a7e7-b4aa5a76d365","Type":"ContainerStarted","Data":"2cba8d19a33eac9b833c90e5dac073c7982e6ea995f6efff2f0566bdc38b8763"} Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.300679 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" event={"ID":"067d381c-1dc8-40d0-880e-8b1d95cfef3e","Type":"ContainerStarted","Data":"6851549b320dec0a0e9a340d874ea7565225759ff8d9e78185b654389e19698f"} Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.306503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xjr\" (UniqueName: \"kubernetes.io/projected/72cc10e2-06e7-4827-b787-3a3d9c2566a5-kube-api-access-52xjr\") pod \"machine-config-controller-84d6567774-dftsm\" (UID: \"72cc10e2-06e7-4827-b787-3a3d9c2566a5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.308476 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9rz\" (UniqueName: \"kubernetes.io/projected/199db3af-ca8b-4ae4-8adf-46a0facb2d55-kube-api-access-pc9rz\") pod \"openshift-controller-manager-operator-756b6f6bc6-htlg4\" (UID: \"199db3af-ca8b-4ae4-8adf-46a0facb2d55\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.320426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqj9m\" (UniqueName: \"kubernetes.io/projected/c221c29b-d053-4b03-a758-ff1f5fada663-kube-api-access-xqj9m\") pod \"dns-default-65k7d\" (UID: 
\"c221c29b-d053-4b03-a758-ff1f5fada663\") " pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.331656 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.339199 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnmd\" (UniqueName: \"kubernetes.io/projected/ed7b294c-46b8-4519-b97a-63f8c24d8cf0-kube-api-access-4wnmd\") pod \"packageserver-d55dfcdfc-n7v94\" (UID: \"ed7b294c-46b8-4519-b97a-63f8c24d8cf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.349132 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.349525 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.849508253 +0000 UTC m=+137.598478499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.360329 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvht9\" (UniqueName: \"kubernetes.io/projected/585f1543-d912-4485-b645-3c818242f920-kube-api-access-qvht9\") pod \"machine-config-server-mtcwc\" (UID: \"585f1543-d912-4485-b645-3c818242f920\") " pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.379284 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4gz\" (UniqueName: \"kubernetes.io/projected/6defa455-47dd-4d1f-a77d-a3a4617df1b8-kube-api-access-tk4gz\") pod \"multus-admission-controller-857f4d67dd-5mlg6\" (UID: \"6defa455-47dd-4d1f-a77d-a3a4617df1b8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.399614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9kn\" (UniqueName: \"kubernetes.io/projected/9c3326f1-6d06-4219-8ac1-5aa424b3e1a4-kube-api-access-th9kn\") pod \"openshift-config-operator-7777fb866f-zp9d5\" (UID: \"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.419140 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.420973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxht4\" (UniqueName: \"kubernetes.io/projected/cd4168b6-1d83-48c4-95f1-88b04d773564-kube-api-access-qxht4\") pod \"ingress-canary-xhhgn\" (UID: \"cd4168b6-1d83-48c4-95f1-88b04d773564\") " pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.426978 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.449683 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.449833 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.949811037 +0000 UTC m=+137.698781293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.449956 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.450287 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:44.950274921 +0000 UTC m=+137.699245177 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.469066 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.469193 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.482591 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.489237 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.518623 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.535322 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.551294 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.551398 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.051374638 +0000 UTC m=+137.800344874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.551601 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.551956 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.051945135 +0000 UTC m=+137.800915371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.568131 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.595883 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xhhgn" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.603432 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mtcwc" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.652914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.653196 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.153177696 +0000 UTC m=+137.902147942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.653525 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.653835 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.153824665 +0000 UTC m=+137.902794921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.755085 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.755402 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.255387957 +0000 UTC m=+138.004358193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.758302 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhlg\" (UniqueName: \"kubernetes.io/projected/f7098953-20ce-4f6d-a04e-c79d2811ecd6-kube-api-access-gmhlg\") pod \"csi-hostpathplugin-tnfw8\" (UID: \"f7098953-20ce-4f6d-a04e-c79d2811ecd6\") " pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.762494 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5k2\" (UniqueName: \"kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2\") pod \"marketplace-operator-79b997595-9mslz\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.765412 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjxs\" (UniqueName: \"kubernetes.io/projected/4035ba65-b9bf-4363-96b4-ee3bcfd55988-kube-api-access-8kjxs\") pod \"package-server-manager-789f6589d5-chjwd\" (UID: \"4035ba65-b9bf-4363-96b4-ee3bcfd55988\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.794833 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.806224 5004 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bg9k"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.808681 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.808736 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mdbfw"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.813904 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.813960 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.824867 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.827793 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kj2rg"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.837315 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5646w"] Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.857194 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.858040 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.358023259 +0000 UTC m=+138.106993495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.926249 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.958585 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.959482 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.459438766 +0000 UTC m=+138.208409002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: W1203 14:08:44.960166 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cbc1305_4f62_4db0_85ac_47bf78c2ae85.slice/crio-973bcf05523c5699d2d9bb3dcd21cc21fd493a686700fb90d52294b3b23e25a2 WatchSource:0}: Error finding container 973bcf05523c5699d2d9bb3dcd21cc21fd493a686700fb90d52294b3b23e25a2: Status 404 returned error can't find the container with id 973bcf05523c5699d2d9bb3dcd21cc21fd493a686700fb90d52294b3b23e25a2 Dec 03 14:08:44 crc kubenswrapper[5004]: I1203 14:08:44.961407 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:44 crc kubenswrapper[5004]: E1203 14:08:44.962245 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.462231038 +0000 UTC m=+138.211201264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:44 crc kubenswrapper[5004]: W1203 14:08:44.974158 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52c9fcc_c539_4037_9bae_810fecabe628.slice/crio-e3b6d9264af027f74287e422e45dd3cada1b94e2111dee03f0d5319e9aaa6c2f WatchSource:0}: Error finding container e3b6d9264af027f74287e422e45dd3cada1b94e2111dee03f0d5319e9aaa6c2f: Status 404 returned error can't find the container with id e3b6d9264af027f74287e422e45dd3cada1b94e2111dee03f0d5319e9aaa6c2f Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.048897 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.062336 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.062523 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.562496642 +0000 UTC m=+138.311466878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.062583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.063351 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.563343436 +0000 UTC m=+138.312313672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.163925 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.164070 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.664047622 +0000 UTC m=+138.413017868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.164219 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.164588 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.664576108 +0000 UTC m=+138.413546344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.252626 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.266744 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.267125 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.767095187 +0000 UTC m=+138.516065423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.269144 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.275119 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.311167 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" event={"ID":"ae44e9f6-1abb-4d46-9605-4c51579c6933","Type":"ContainerStarted","Data":"b924a49da3d43b534981bc312d4b5c30083fa2571f1d42fedb46b0040e86e304"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.313352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" event={"ID":"e52c9fcc-c539-4037-9bae-810fecabe628","Type":"ContainerStarted","Data":"e3b6d9264af027f74287e422e45dd3cada1b94e2111dee03f0d5319e9aaa6c2f"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.314321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" event={"ID":"a887d450-ffa8-4b30-98db-2e223c46b134","Type":"ContainerStarted","Data":"2f6a01203fb895109cc87c2962da2da5721aa3c175115cc41885c533369778fc"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.315690 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" event={"ID":"067d381c-1dc8-40d0-880e-8b1d95cfef3e","Type":"ContainerStarted","Data":"2497cd3ca500a3464163d68a43753d271d372516b5782ce906ef853e47ec7972"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.316464 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" event={"ID":"d787a412-6039-41df-9007-e70b05b958a4","Type":"ContainerStarted","Data":"26a58b0863fbb2f2a03e920d067dde474573d75bfb8e1639fa7ef839d5d3e42c"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.317245 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5646w" event={"ID":"8c89b35e-1cbc-45b2-b90b-ae778d622bb9","Type":"ContainerStarted","Data":"42d06c7a8d634ce65df9b0df5483c4fd08adbaf218b83ebfa792d7cc039393d6"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.319520 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" event={"ID":"9cbc1305-4f62-4db0-85ac-47bf78c2ae85","Type":"ContainerStarted","Data":"973bcf05523c5699d2d9bb3dcd21cc21fd493a686700fb90d52294b3b23e25a2"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.320424 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" event={"ID":"ea4dcd98-ce38-4c3e-93d2-9d714f509954","Type":"ContainerStarted","Data":"bbefa38104dc0b72f3390aedaa00bcab33f49db232354562baf9d634eb4f09f4"} Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.421374 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.421943 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:45.921923611 +0000 UTC m=+138.670893847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: W1203 14:08:45.481021 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585f1543_d912_4485_b645_3c818242f920.slice/crio-90bf752f7de98c3cc19f81f2cebc4cbed2590ef3c99af057b6f431f3fb904786 WatchSource:0}: Error finding container 90bf752f7de98c3cc19f81f2cebc4cbed2590ef3c99af057b6f431f3fb904786: Status 404 returned error can't find the container with id 90bf752f7de98c3cc19f81f2cebc4cbed2590ef3c99af057b6f431f3fb904786 Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.525736 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.526802 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.02678441 +0000 UTC m=+138.775754646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.628459 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.631339 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.131316609 +0000 UTC m=+138.880286855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.682295 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.696494 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.696774 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q8xqx"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.706274 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.733445 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.733725 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.233681873 +0000 UTC m=+138.982652109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.733847 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.734399 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.234390204 +0000 UTC m=+138.983360440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: W1203 14:08:45.768618 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0b812e_08c2_4e3f_bb8e_e7bc314e7533.slice/crio-bc4e6f3620a0feb8bb2123b529508c02e278eaba3305fa85fccd290ee6a2e527 WatchSource:0}: Error finding container bc4e6f3620a0feb8bb2123b529508c02e278eaba3305fa85fccd290ee6a2e527: Status 404 returned error can't find the container with id bc4e6f3620a0feb8bb2123b529508c02e278eaba3305fa85fccd290ee6a2e527 Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.821361 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.834821 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.835337 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.335320677 +0000 UTC m=+139.084290913 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:45 crc kubenswrapper[5004]: W1203 14:08:45.859779 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d600ee4_d7d6_4478_bf62_1383c0f9b35c.slice/crio-cd8054762d4243de70386721147ea2e499b737a52fb589afa99403e1489dbb57 WatchSource:0}: Error finding container cd8054762d4243de70386721147ea2e499b737a52fb589afa99403e1489dbb57: Status 404 returned error can't find the container with id cd8054762d4243de70386721147ea2e499b737a52fb589afa99403e1489dbb57 Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.901062 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.916576 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d5x7k"] Dec 03 14:08:45 crc kubenswrapper[5004]: I1203 14:08:45.938028 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:45 crc kubenswrapper[5004]: E1203 14:08:45.938388 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.438375342 +0000 UTC m=+139.187345578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.003654 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd598385d_7b3b_4ac4_be9d_8523a0a14bd0.slice/crio-ec43e8ac0e2e0146135cb726676323d0eef0baa1ee61828b4e913bd8beb2e023 WatchSource:0}: Error finding container ec43e8ac0e2e0146135cb726676323d0eef0baa1ee61828b4e913bd8beb2e023: Status 404 returned error can't find the container with id ec43e8ac0e2e0146135cb726676323d0eef0baa1ee61828b4e913bd8beb2e023 Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.014409 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.027120 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.035308 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.039055 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.040360 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.040515 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.54049272 +0000 UTC m=+139.289462956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.040633 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.040909 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.540898301 +0000 UTC m=+139.289868537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.041939 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.052583 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp"] Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.067139 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf986649e_61c8_4c67_beb3_edc5dc4e4fd9.slice/crio-6e1510c9fa2b2a6a00415caa6ce3f64f73ced71b7e1bed39d8316aa92647c833 WatchSource:0}: Error finding container 6e1510c9fa2b2a6a00415caa6ce3f64f73ced71b7e1bed39d8316aa92647c833: Status 404 returned error can't find the container with id 6e1510c9fa2b2a6a00415caa6ce3f64f73ced71b7e1bed39d8316aa92647c833 Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.092706 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199db3af_ca8b_4ae4_8adf_46a0facb2d55.slice/crio-838604305c8142f652a8ce12141e3bda491a88174f5fc0d147a6aaf9d242749c WatchSource:0}: Error finding container 838604305c8142f652a8ce12141e3bda491a88174f5fc0d147a6aaf9d242749c: Status 404 returned error can't find the container with 
id 838604305c8142f652a8ce12141e3bda491a88174f5fc0d147a6aaf9d242749c Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.094452 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf6a96c2_22d0_4efd_9df0_b6ed4dddb2c8.slice/crio-6feb6a87076e18e1735dba99a9c7f977616139cec4acee7ea5f8e10f31ef4243 WatchSource:0}: Error finding container 6feb6a87076e18e1735dba99a9c7f977616139cec4acee7ea5f8e10f31ef4243: Status 404 returned error can't find the container with id 6feb6a87076e18e1735dba99a9c7f977616139cec4acee7ea5f8e10f31ef4243 Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.099383 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc452a836_d7b1_45d5_b07e_715591179f58.slice/crio-19b9e41fdb973cb3baeaab43793fd4a3b6106b85d647b9ee6235f6a233da4128 WatchSource:0}: Error finding container 19b9e41fdb973cb3baeaab43793fd4a3b6106b85d647b9ee6235f6a233da4128: Status 404 returned error can't find the container with id 19b9e41fdb973cb3baeaab43793fd4a3b6106b85d647b9ee6235f6a233da4128 Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.103442 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84639727_6aec_44af_a590_fc6f6a11ba3d.slice/crio-e445be1da1c95ac285943ec56d6de483bfe81468218e17fca7f30b4ec2a1562c WatchSource:0}: Error finding container e445be1da1c95ac285943ec56d6de483bfe81468218e17fca7f30b4ec2a1562c: Status 404 returned error can't find the container with id e445be1da1c95ac285943ec56d6de483bfe81468218e17fca7f30b4ec2a1562c Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.142511 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.144748 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.644689208 +0000 UTC m=+139.393659444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.146662 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.147201 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.647185461 +0000 UTC m=+139.396155697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.247945 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.248602 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.74830145 +0000 UTC m=+139.497271686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.249401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.250213 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.750198775 +0000 UTC m=+139.499169011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.337266 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" event={"ID":"45ef6725-be2e-4fac-8158-4322a766ac08","Type":"ContainerStarted","Data":"17301cab85fffa8f896550cfcf9b2c939bba524f4ee1dfd0f278f89ca9659532"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.343215 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" event={"ID":"1b84a4a2-68c0-4b25-90c0-78e439a258a0","Type":"ContainerStarted","Data":"af87a1dea5a0092c35c45c4705831eadd9e97899312b7658dda6581ce3f218b2"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.347455 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" event={"ID":"958edf90-36c2-4be7-b1fc-b35607b151e4","Type":"ContainerStarted","Data":"7a8f48e4030e63b26d4ea96fdfd7f4fd029830434a7ce3566f63745158de92b2"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.350468 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.350938 5004 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.850921312 +0000 UTC m=+139.599891548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.388937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" event={"ID":"067d381c-1dc8-40d0-880e-8b1d95cfef3e","Type":"ContainerStarted","Data":"a9f4dea7dedf2d9fcf75582faa7d1a82673a796a31efe05ac424e0df9e7ffd16"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.424138 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" event={"ID":"6d600ee4-d7d6-4478-bf62-1383c0f9b35c","Type":"ContainerStarted","Data":"cd8054762d4243de70386721147ea2e499b737a52fb589afa99403e1489dbb57"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.440666 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k6knf" podStartSLOduration=121.440643946 podStartE2EDuration="2m1.440643946s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.409951555 +0000 UTC m=+139.158921791" watchObservedRunningTime="2025-12-03 14:08:46.440643946 +0000 UTC 
m=+139.189614182" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.441006 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.451654 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.452065 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:46.95205299 +0000 UTC m=+139.701023226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.453159 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" event={"ID":"9de87900-83eb-4764-b478-959ab83fb572","Type":"ContainerStarted","Data":"dcd6ea1e1e29ffa7ec59ca688ce41e465743be2438b7f50dd57267ebf74f3b1d"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.453400 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" event={"ID":"9de87900-83eb-4764-b478-959ab83fb572","Type":"ContainerStarted","Data":"62f095615022fa8bd0a5b6a04b6ceca666a63187e4baca038bf847d0ad1824bd"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.462989 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" podStartSLOduration=120.462971501 podStartE2EDuration="2m0.462971501s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.455921594 +0000 UTC m=+139.204891830" watchObservedRunningTime="2025-12-03 14:08:46.462971501 +0000 UTC m=+139.211941737" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.472505 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" 
event={"ID":"c452a836-d7b1-45d5-b07e-715591179f58","Type":"ContainerStarted","Data":"19b9e41fdb973cb3baeaab43793fd4a3b6106b85d647b9ee6235f6a233da4128"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.475039 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" event={"ID":"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533","Type":"ContainerStarted","Data":"a0ca0d0630cf929ef0dac09265ca5add60a8888c7328399acbfad960e91c9a25"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.475092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" event={"ID":"ff0b812e-08c2-4e3f-bb8e-e7bc314e7533","Type":"ContainerStarted","Data":"bc4e6f3620a0feb8bb2123b529508c02e278eaba3305fa85fccd290ee6a2e527"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.476145 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.489706 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mtcwc" event={"ID":"585f1543-d912-4485-b645-3c818242f920","Type":"ContainerStarted","Data":"1fce8448704818b8007094fbbabb5613316f0bfe6af3975e95148918a88b40a6"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.489747 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mtcwc" event={"ID":"585f1543-d912-4485-b645-3c818242f920","Type":"ContainerStarted","Data":"90bf752f7de98c3cc19f81f2cebc4cbed2590ef3c99af057b6f431f3fb904786"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.502812 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lcc4c" podStartSLOduration=120.50279588 
podStartE2EDuration="2m0.50279588s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.48678418 +0000 UTC m=+139.235754416" watchObservedRunningTime="2025-12-03 14:08:46.50279588 +0000 UTC m=+139.251766116" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.503210 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.511653 5004 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-knk6r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.512035 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" podUID="ff0b812e-08c2-4e3f-bb8e-e7bc314e7533" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.560512 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" podStartSLOduration=120.560487453 podStartE2EDuration="2m0.560487453s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.523492217 +0000 UTC m=+139.272462453" watchObservedRunningTime="2025-12-03 14:08:46.560487453 +0000 UTC m=+139.309457689" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.561196 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.562633 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.062312177 +0000 UTC m=+139.811282413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.579611 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mtcwc" podStartSLOduration=5.579589054 podStartE2EDuration="5.579589054s" podCreationTimestamp="2025-12-03 14:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.562379569 +0000 UTC m=+139.311349805" watchObservedRunningTime="2025-12-03 14:08:46.579589054 +0000 UTC m=+139.328559290" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.580002 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.582607 
5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xhhgn"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.591328 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" event={"ID":"f986649e-61c8-4c67-beb3-edc5dc4e4fd9","Type":"ContainerStarted","Data":"6e1510c9fa2b2a6a00415caa6ce3f64f73ced71b7e1bed39d8316aa92647c833"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.598515 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5mlg6"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.607502 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwxpp"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.617583 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.633911 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65k7d"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.634215 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vcl7"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.646904 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.653270 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.655806 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f"] Dec 03 14:08:46 crc 
kubenswrapper[5004]: I1203 14:08:46.657406 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tnfw8"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.666214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.666556 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.166538646 +0000 UTC m=+139.915508882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.674166 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.767186 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.767620 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.267602583 +0000 UTC m=+140.016572829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: W1203 14:08:46.816311 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb95056_cffc_433a_a3a7_17ad434cf41f.slice/crio-5b5ee51da3c2780c55e24bfc2a9e49329a2c70a7b93f6badb6d1ba52290ceaca WatchSource:0}: Error finding container 5b5ee51da3c2780c55e24bfc2a9e49329a2c70a7b93f6badb6d1ba52290ceaca: Status 404 returned error can't find the container with id 5b5ee51da3c2780c55e24bfc2a9e49329a2c70a7b93f6badb6d1ba52290ceaca Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.869239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.869649 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.369634618 +0000 UTC m=+140.118604854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.912081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" event={"ID":"dfbd3e28-71fd-4412-8218-9ca072542838","Type":"ContainerStarted","Data":"5891da43ca37e105271fe615a09869ff7c2f94b5575c3179ef131d73b03ef3bf"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.912123 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" event={"ID":"dfbd3e28-71fd-4412-8218-9ca072542838","Type":"ContainerStarted","Data":"897c1991f97ca5b3a8fc7488041d8550f408a4431c9579d61c0dd0de9d79fb15"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.940402 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ltc99" event={"ID":"8704c023-6680-4430-a7e7-b4aa5a76d365","Type":"ContainerStarted","Data":"9a04683526ec9f5e336c397da072a94bef0ddf7ab3b7b072e30dfec442448293"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.947351 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" 
event={"ID":"84639727-6aec-44af-a590-fc6f6a11ba3d","Type":"ContainerStarted","Data":"e445be1da1c95ac285943ec56d6de483bfe81468218e17fca7f30b4ec2a1562c"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.951694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" event={"ID":"ea4dcd98-ce38-4c3e-93d2-9d714f509954","Type":"ContainerStarted","Data":"c71b2469f4b2fb6e925906a33c711239de70b48537303a074c8c7d9bce6e964a"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.951801 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.958025 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" event={"ID":"f13001d1-8878-499b-87c3-7730c30b1a5c","Type":"ContainerStarted","Data":"1d41724f14524304ef28ba6cf646bc31b73b38d0aef9d5edc03d04ebb4602224"} Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.963882 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.964791 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ltc99" podStartSLOduration=120.964772801 podStartE2EDuration="2m0.964772801s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.96406482 +0000 UTC m=+139.713035066" watchObservedRunningTime="2025-12-03 14:08:46.964772801 +0000 UTC m=+139.713743037" Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.971048 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:46 crc kubenswrapper[5004]: E1203 14:08:46.971682 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.471667723 +0000 UTC m=+140.220637949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:46 crc kubenswrapper[5004]: I1203 14:08:46.992729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" event={"ID":"e52c9fcc-c539-4037-9bae-810fecabe628","Type":"ContainerStarted","Data":"2c492585302e10a67f9b759caf68c88668bc67371440f60a9704432a03156102"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.020681 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wm25j" podStartSLOduration=122.020662911 podStartE2EDuration="2m2.020662911s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.020586209 +0000 UTC m=+139.769556455" watchObservedRunningTime="2025-12-03 14:08:47.020662911 +0000 UTC 
m=+139.769633157" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.020811 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kj2rg" podStartSLOduration=121.020805116 podStartE2EDuration="2m1.020805116s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:46.991941378 +0000 UTC m=+139.740911604" watchObservedRunningTime="2025-12-03 14:08:47.020805116 +0000 UTC m=+139.769775352" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.025288 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" event={"ID":"d787a412-6039-41df-9007-e70b05b958a4","Type":"ContainerStarted","Data":"b34f3a007c74814928362cc5aa1c9267288a5726c52ec6e673e1fc256362e172"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.025678 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.037136 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" event={"ID":"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8","Type":"ContainerStarted","Data":"6feb6a87076e18e1735dba99a9c7f977616139cec4acee7ea5f8e10f31ef4243"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.062505 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" event={"ID":"199db3af-ca8b-4ae4-8adf-46a0facb2d55","Type":"ContainerStarted","Data":"838604305c8142f652a8ce12141e3bda491a88174f5fc0d147a6aaf9d242749c"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.074694 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.075160 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.077527 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.57750859 +0000 UTC m=+140.326478826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.087426 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" podStartSLOduration=121.08740529 podStartE2EDuration="2m1.08740529s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.065429105 +0000 UTC m=+139.814399341" watchObservedRunningTime="2025-12-03 14:08:47.08740529 +0000 UTC 
m=+139.836375526" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.097077 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" event={"ID":"d598385d-7b3b-4ac4-be9d-8523a0a14bd0","Type":"ContainerStarted","Data":"ec43e8ac0e2e0146135cb726676323d0eef0baa1ee61828b4e913bd8beb2e023"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.103139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" event={"ID":"5b676595-3778-4703-a0b1-654d54d007fc","Type":"ContainerStarted","Data":"050b9bca68fcae5a994bdaf8d08e2c090d6365b6d97d44df0714cfc730282002"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.119757 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" event={"ID":"eaf11e19-667b-4d30-b6fa-71af6a5a1182","Type":"ContainerStarted","Data":"9959c358ed954deb60de9829cc56464a531700daee05772200147132879b7172"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.128265 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" event={"ID":"ae44e9f6-1abb-4d46-9605-4c51579c6933","Type":"ContainerStarted","Data":"05cb80aa26d93d5e03226c0a6615848293e809fa59cf93e455d4e9e718182164"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.149289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5646w" event={"ID":"8c89b35e-1cbc-45b2-b90b-ae778d622bb9","Type":"ContainerStarted","Data":"ef051fabcba6042c546280fc33835eceded762fc7cccbd5b43ba15a1491d4f8f"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.151205 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.155044 5004 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-5646w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.155094 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5646w" podUID="8c89b35e-1cbc-45b2-b90b-ae778d622bb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.158294 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" podStartSLOduration=121.158277761 podStartE2EDuration="2m1.158277761s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.156707435 +0000 UTC m=+139.905677671" watchObservedRunningTime="2025-12-03 14:08:47.158277761 +0000 UTC m=+139.907248007" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.160888 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ll8wz" event={"ID":"8eede088-bf0c-48cb-b158-d58aa0c58eb0","Type":"ContainerStarted","Data":"8635654d975694c22781cbfab2f1363c8a2de8b8541b3f81fb91de7bc8c4e7cf"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.175990 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.177985 5004 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.677965839 +0000 UTC m=+140.426936075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.199743 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bg9k" podStartSLOduration=122.199725998 podStartE2EDuration="2m2.199725998s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.198299396 +0000 UTC m=+139.947269632" watchObservedRunningTime="2025-12-03 14:08:47.199725998 +0000 UTC m=+139.948696234" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.221486 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" event={"ID":"a887d450-ffa8-4b30-98db-2e223c46b134","Type":"ContainerStarted","Data":"316905a64c70622c79c92d8fc3d7a99e329f3fee444664491206ff3a29a03806"} Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.236687 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5646w" podStartSLOduration=121.236672382 podStartE2EDuration="2m1.236672382s" podCreationTimestamp="2025-12-03 14:06:46 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.235408195 +0000 UTC m=+139.984378441" watchObservedRunningTime="2025-12-03 14:08:47.236672382 +0000 UTC m=+139.985642628" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.270579 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" podStartSLOduration=121.270558607 podStartE2EDuration="2m1.270558607s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.269388432 +0000 UTC m=+140.018358678" watchObservedRunningTime="2025-12-03 14:08:47.270558607 +0000 UTC m=+140.019528833" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.278476 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.280270 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.780251181 +0000 UTC m=+140.529221487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.317209 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ll8wz" podStartSLOduration=121.317190606 podStartE2EDuration="2m1.317190606s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:47.315492086 +0000 UTC m=+140.064462332" watchObservedRunningTime="2025-12-03 14:08:47.317190606 +0000 UTC m=+140.066160832" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.379609 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.379971 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.879951148 +0000 UTC m=+140.628921384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.486766 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.487558 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:47.987541866 +0000 UTC m=+140.736512102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.588341 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.588746 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.088715686 +0000 UTC m=+140.837685922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.690636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.691066 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.19105141 +0000 UTC m=+140.940021646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.798653 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.798802 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.298776981 +0000 UTC m=+141.047747217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.799014 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.800654 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.800674 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.300659376 +0000 UTC m=+141.049629612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.818575 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:47 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:47 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:47 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.818817 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:47 crc kubenswrapper[5004]: I1203 14:08:47.902473 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:47 crc kubenswrapper[5004]: E1203 14:08:47.902773 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:48.402756553 +0000 UTC m=+141.151726789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.004622 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.005067 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.505053246 +0000 UTC m=+141.254023482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.105547 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.106034 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.60601575 +0000 UTC m=+141.354985996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.206624 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.206974 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.706962583 +0000 UTC m=+141.455932819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.240679 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" event={"ID":"4035ba65-b9bf-4363-96b4-ee3bcfd55988","Type":"ContainerStarted","Data":"c45d2592a662cac69773f043064d8b3f4db05530017c30389628472f784dfbab"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.245071 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" event={"ID":"f7098953-20ce-4f6d-a04e-c79d2811ecd6","Type":"ContainerStarted","Data":"4c1b01290374667eb5abca2933f0447053065a7e20d6f5428f67354a1123e546"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.254126 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" event={"ID":"c452a836-d7b1-45d5-b07e-715591179f58","Type":"ContainerStarted","Data":"b902fc8985fae952e0b578516a5ada68a4943a35287409b57221f008224d04d4"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.263279 5004 generic.go:334] "Generic (PLEG): container finished" podID="6d600ee4-d7d6-4478-bf62-1383c0f9b35c" containerID="96b3f38d78c3c818551073467b29a09f4978f9ef1fe29266abd051812a2b61e4" exitCode=0 Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.264033 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" 
event={"ID":"6d600ee4-d7d6-4478-bf62-1383c0f9b35c","Type":"ContainerDied","Data":"96b3f38d78c3c818551073467b29a09f4978f9ef1fe29266abd051812a2b61e4"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.272044 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" event={"ID":"6defa455-47dd-4d1f-a77d-a3a4617df1b8","Type":"ContainerStarted","Data":"13fdd5544e075d48d813d90236d7110704b1575e386722fa0a57c86bd570df86"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.297196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mdbfw" event={"ID":"a887d450-ffa8-4b30-98db-2e223c46b134","Type":"ContainerStarted","Data":"e4138d9866a4dd0cfe174e2e801dc36017b479f341af8203e8ea36d7385e16d6"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.302707 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ll8wz" event={"ID":"8eede088-bf0c-48cb-b158-d58aa0c58eb0","Type":"ContainerStarted","Data":"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.305532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" event={"ID":"f986649e-61c8-4c67-beb3-edc5dc4e4fd9","Type":"ContainerStarted","Data":"6a7862b353fd446c37d96b67d305eb718dd9e896447ed69159dcca5c9f028780"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.308874 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.309324 5004 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.809308107 +0000 UTC m=+141.558278343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.317338 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sz8cd" event={"ID":"9cbc1305-4f62-4db0-85ac-47bf78c2ae85","Type":"ContainerStarted","Data":"89852f21f753e545f51547964f722661c99cafcf287dcb69c1a6d6a88403503e"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.321422 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" event={"ID":"1b84a4a2-68c0-4b25-90c0-78e439a258a0","Type":"ContainerStarted","Data":"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.324468 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.332913 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" event={"ID":"84639727-6aec-44af-a590-fc6f6a11ba3d","Type":"ContainerStarted","Data":"4cf7976eb290d371ac8fb0988e1b431c2fd34fc278d3242f0260f166a8474dc8"} 
Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.338712 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.344514 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" event={"ID":"72cc10e2-06e7-4827-b787-3a3d9c2566a5","Type":"ContainerStarted","Data":"46ffb45ff86707efd00aef9da7dcd58e5ed37866fe9b755635831a8caafeaaf4"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.344624 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" event={"ID":"72cc10e2-06e7-4827-b787-3a3d9c2566a5","Type":"ContainerStarted","Data":"7691f1e810ed33d6154c5ea399b22763f7ecb2e90d15e8a7d81348af263cc785"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.385370 5004 generic.go:334] "Generic (PLEG): container finished" podID="d598385d-7b3b-4ac4-be9d-8523a0a14bd0" containerID="0de96f701ebd72eddb4bad33c49a42b63fe02b9fb58a0d9578f7799f36ade51d" exitCode=0 Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.385489 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" event={"ID":"d598385d-7b3b-4ac4-be9d-8523a0a14bd0","Type":"ContainerDied","Data":"0de96f701ebd72eddb4bad33c49a42b63fe02b9fb58a0d9578f7799f36ade51d"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.411455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 
14:08:48.419636 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" event={"ID":"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f","Type":"ContainerStarted","Data":"c368fe6caa51db0f317503c565fcb9367433fd1e764205cb22331504427926cf"} Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.429744 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:48.929713851 +0000 UTC m=+141.678684087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.439814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" event={"ID":"f13001d1-8878-499b-87c3-7730c30b1a5c","Type":"ContainerStarted","Data":"f4f2b43c9ed58b197f8307b302b350011e53da3de9a54ab046a33ae53c012665"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.472768 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.477981 5004 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vxwzk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Dec 03 
14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.478045 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.513524 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" event={"ID":"bcb95056-cffc-433a-a3a7-17ad434cf41f","Type":"ContainerStarted","Data":"5b5ee51da3c2780c55e24bfc2a9e49329a2c70a7b93f6badb6d1ba52290ceaca"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.514202 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.515553 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.01552766 +0000 UTC m=+141.764497956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.544143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" event={"ID":"926e2906-448f-4006-a186-2b45932f51e6","Type":"ContainerStarted","Data":"34911db69db885742aa6cf9a9c05ad1e0857640e8a6f3a44360cd85d81ea214a"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.566452 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xhhgn" event={"ID":"cd4168b6-1d83-48c4-95f1-88b04d773564","Type":"ContainerStarted","Data":"f39cabea45b051fd70c58af851dc119e05d97245bdb26df45d5dc108a33ed68d"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.591407 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" event={"ID":"ed7b294c-46b8-4519-b97a-63f8c24d8cf0","Type":"ContainerStarted","Data":"fd4452d2d0e88418ab96271ce976ae489c4391678269e52dd0a26aebb22faa77"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.592168 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.597002 5004 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-n7v94 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:5443/healthz\": dial tcp 
10.217.0.14:5443: connect: connection refused" start-of-body= Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.597049 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" podUID="ed7b294c-46b8-4519-b97a-63f8c24d8cf0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.14:5443/healthz\": dial tcp 10.217.0.14:5443: connect: connection refused" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.597296 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" event={"ID":"958edf90-36c2-4be7-b1fc-b35607b151e4","Type":"ContainerStarted","Data":"f32c40d92b065accceabfa52d9b61d5c1d44d034787794482d86a0e56e725db6"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.618466 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.618770 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.118758241 +0000 UTC m=+141.867728477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.640325 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65k7d" event={"ID":"c221c29b-d053-4b03-a758-ff1f5fada663","Type":"ContainerStarted","Data":"df482ed3d7494aa9d81679e6d3a6561adf10ca81927235d6e7a7dd35460795dd"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.642926 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q8xqx" event={"ID":"eaf11e19-667b-4d30-b6fa-71af6a5a1182","Type":"ContainerStarted","Data":"d9b8362856c934ed40ddb88e9ff6a6d07bd7c31d171879d60244bf23d2309322"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.669843 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" event={"ID":"45ef6725-be2e-4fac-8158-4322a766ac08","Type":"ContainerStarted","Data":"b337e6508a313fde61c4abcdd275baa76cd07c5e2cd5e52e6a1a7c0667eac0dd"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.669905 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" event={"ID":"45ef6725-be2e-4fac-8158-4322a766ac08","Type":"ContainerStarted","Data":"c9216c455a40981c8241453ebb403a08a4e471311683f36ef34e7a94eb62a725"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.705123 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lbc4g" 
podStartSLOduration=122.705103895 podStartE2EDuration="2m2.705103895s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:48.70254269 +0000 UTC m=+141.451512926" watchObservedRunningTime="2025-12-03 14:08:48.705103895 +0000 UTC m=+141.454074131" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.706970 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerStarted","Data":"0379b10d5f11b7b16fecd08cb2440cb2aaea66382f879e784009d83ae9e4428e"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.708416 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.711058 5004 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9mslz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.711095 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.716544 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" 
event={"ID":"5b676595-3778-4703-a0b1-654d54d007fc","Type":"ContainerStarted","Data":"c4678feae067e7208c5b83dc1b2b7cd4915a91c284dda6658a72b4bb4c963f69"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.719467 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.721180 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.221159007 +0000 UTC m=+141.970129253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.736330 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" event={"ID":"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8","Type":"ContainerStarted","Data":"ffb072268e4d3540f7876639a8c81b18776da23e0222e74cddfaac1a27d0d036"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.739908 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" 
event={"ID":"199db3af-ca8b-4ae4-8adf-46a0facb2d55","Type":"ContainerStarted","Data":"9b97bf83162fc5c035e6e7b9a9399f1340410d22bf1d4fcdd1282e36fd9374d6"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.745919 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" event={"ID":"dfbd3e28-71fd-4412-8218-9ca072542838","Type":"ContainerStarted","Data":"9dc45cd536fd14d7b2a9fcf6acbc9059d09151b0db12518867e0efb785af1d9b"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.805664 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" event={"ID":"77632d37-a94f-4bc0-a07c-7880d70c7d5f","Type":"ContainerStarted","Data":"ffeb0f93ac1feb2c63e907b89360d06a3b9df0765569c5e9b35cacf0bada0002"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.825332 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:48 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:48 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:48 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.825397 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.831664 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.833589 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.333575026 +0000 UTC m=+142.082545262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.840114 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" event={"ID":"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4","Type":"ContainerStarted","Data":"8975c0a13c88cf2f0caa66b3e0a6d1d2c6aae9b6005c55e68d787f69b3578671"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.912378 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" podStartSLOduration=122.912357769 podStartE2EDuration="2m2.912357769s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:48.906689462 +0000 UTC m=+141.655659698" watchObservedRunningTime="2025-12-03 14:08:48.912357769 +0000 UTC m=+141.661328005" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.920073 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" event={"ID":"46f500f6-07e9-4242-9d25-31a3fc4e5a6d","Type":"ContainerStarted","Data":"3df6bf082e66ffb4a1de82a8a7bcb72774d057a71873c1d7e9baa64dca42be42"} Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.921076 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-5646w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.921370 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5646w" podUID="8c89b35e-1cbc-45b2-b90b-ae778d622bb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.922352 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" podStartSLOduration=123.922333462 podStartE2EDuration="2m3.922333462s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:48.814474806 +0000 UTC m=+141.563445042" watchObservedRunningTime="2025-12-03 14:08:48.922333462 +0000 UTC m=+141.671303698" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.938107 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:48 crc kubenswrapper[5004]: E1203 14:08:48.938534 5004 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.438473025 +0000 UTC m=+142.187443261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.939088 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knk6r" Dec 03 14:08:48 crc kubenswrapper[5004]: I1203 14:08:48.947098 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" podStartSLOduration=122.947079958 podStartE2EDuration="2m2.947079958s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:48.946557063 +0000 UTC m=+141.695527309" watchObservedRunningTime="2025-12-03 14:08:48.947079958 +0000 UTC m=+141.696050194" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.051711 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.053445 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.55343348 +0000 UTC m=+142.302403716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.085153 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvx5j" podStartSLOduration=123.085137411 podStartE2EDuration="2m3.085137411s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.084906694 +0000 UTC m=+141.833876930" watchObservedRunningTime="2025-12-03 14:08:49.085137411 +0000 UTC m=+141.834107647" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.136084 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-swmpz" podStartSLOduration=123.136066196 podStartE2EDuration="2m3.136066196s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.117063378 +0000 UTC 
m=+141.866033614" watchObservedRunningTime="2025-12-03 14:08:49.136066196 +0000 UTC m=+141.885036432" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.152739 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.153348 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.653328352 +0000 UTC m=+142.402298588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.231277 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-h5gl4" podStartSLOduration=123.23126287 podStartE2EDuration="2m3.23126287s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.19651192 +0000 UTC m=+141.945482166" watchObservedRunningTime="2025-12-03 14:08:49.23126287 +0000 UTC m=+141.980233106" Dec 03 14:08:49 crc kubenswrapper[5004]: 
I1203 14:08:49.232139 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" podStartSLOduration=123.232134946 podStartE2EDuration="2m3.232134946s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.229633132 +0000 UTC m=+141.978603368" watchObservedRunningTime="2025-12-03 14:08:49.232134946 +0000 UTC m=+141.981105182" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.271112 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.271642 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.771630465 +0000 UTC m=+142.520600701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.292233 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" podStartSLOduration=123.292210139 podStartE2EDuration="2m3.292210139s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.290719045 +0000 UTC m=+142.039689281" watchObservedRunningTime="2025-12-03 14:08:49.292210139 +0000 UTC m=+142.041180375" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.323249 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" podStartSLOduration=123.32323489 podStartE2EDuration="2m3.32323489s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.320399767 +0000 UTC m=+142.069370013" watchObservedRunningTime="2025-12-03 14:08:49.32323489 +0000 UTC m=+142.072205126" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.350612 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" podStartSLOduration=123.350595773 podStartE2EDuration="2m3.350595773s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.349692326 +0000 UTC m=+142.098662562" watchObservedRunningTime="2025-12-03 14:08:49.350595773 +0000 UTC m=+142.099566009" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.380346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.380850 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.880835441 +0000 UTC m=+142.629805677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.483571 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c45j8" podStartSLOduration=123.483550826 podStartE2EDuration="2m3.483550826s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.411682816 +0000 UTC m=+142.160653052" watchObservedRunningTime="2025-12-03 14:08:49.483550826 +0000 UTC m=+142.232521062" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.483768 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.484137 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:49.984121852 +0000 UTC m=+142.733092088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.513324 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6qgs2" podStartSLOduration=123.513307509 podStartE2EDuration="2m3.513307509s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.483766772 +0000 UTC m=+142.232737008" watchObservedRunningTime="2025-12-03 14:08:49.513307509 +0000 UTC m=+142.262277745" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.513949 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-htlg4" podStartSLOduration=123.513945648 podStartE2EDuration="2m3.513945648s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:49.513787833 +0000 UTC m=+142.262758069" watchObservedRunningTime="2025-12-03 14:08:49.513945648 +0000 UTC m=+142.262915884" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.584413 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.585133 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.085117317 +0000 UTC m=+142.834087553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.685732 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.686086 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.18606723 +0000 UTC m=+142.935037466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.788508 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.788660 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.288633601 +0000 UTC m=+143.037603837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.788792 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.789152 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.289143656 +0000 UTC m=+143.038113892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.803687 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:49 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:49 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:49 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.803975 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.889536 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.889687 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:50.389662607 +0000 UTC m=+143.138632843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.890131 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.890529 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.390513312 +0000 UTC m=+143.139483558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.944945 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" event={"ID":"ed7b294c-46b8-4519-b97a-63f8c24d8cf0","Type":"ContainerStarted","Data":"bdee2a9093f34482302ca23d7f3da8003295627156f8072a71d854e09010e592"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.953719 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" event={"ID":"d598385d-7b3b-4ac4-be9d-8523a0a14bd0","Type":"ContainerStarted","Data":"0301d7db60b7264ed889337a22358e064286e3bab3607ac4ff133c9a4b840a5d"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.970804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" event={"ID":"4035ba65-b9bf-4363-96b4-ee3bcfd55988","Type":"ContainerStarted","Data":"8aa670a42deab05b0affe25d35a42ed4db2ccc8ea738cd7e3135f7bd2a726e84"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.971109 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" event={"ID":"4035ba65-b9bf-4363-96b4-ee3bcfd55988","Type":"ContainerStarted","Data":"bde92200cbe4e82f23bdc91df9a5c80d62bf38971cce3991e5dd74297f6a7d4c"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.971289 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.991099 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:49 crc kubenswrapper[5004]: E1203 14:08:49.991362 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.491337261 +0000 UTC m=+143.240307497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.993524 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" event={"ID":"926e2906-448f-4006-a186-2b45932f51e6","Type":"ContainerStarted","Data":"cc251954dd3fa96ac69718e3eeacdfed066b30b01020d0d880612293961acf0b"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.995725 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65k7d" 
event={"ID":"c221c29b-d053-4b03-a758-ff1f5fada663","Type":"ContainerStarted","Data":"e6c255a547903310812b2aefb0a5d0e2dd38e281ce8e724456812f29b1cb95f9"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.995766 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65k7d" event={"ID":"c221c29b-d053-4b03-a758-ff1f5fada663","Type":"ContainerStarted","Data":"ae7fa6ac7de507ecf58621aa75167f6439c930986a31477b9063b514cdf258fe"} Dec 03 14:08:49 crc kubenswrapper[5004]: I1203 14:08:49.996176 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.000880 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" event={"ID":"72cc10e2-06e7-4827-b787-3a3d9c2566a5","Type":"ContainerStarted","Data":"2be7f9850513b10afa78882c52168bbfb71f2ccad805b340333c3fde5350cc42"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.013538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xhhgn" event={"ID":"cd4168b6-1d83-48c4-95f1-88b04d773564","Type":"ContainerStarted","Data":"9313ed283c43dc46d5e6283403fd78716b77a812a0db68d4da61ba3f2170aff4"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.015678 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" event={"ID":"bcb95056-cffc-433a-a3a7-17ad434cf41f","Type":"ContainerStarted","Data":"f757e10f510fd0dad015f9521f555e85b0c92cec5a3581d99d5fd350fd1abddb"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.015826 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.018229 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerStarted","Data":"a17a3fbeefeb9762061857ac31ff8b7e7fa4e5a5fcbd3f222153ade5f780df51"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.023078 5004 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9mslz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.023367 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.027999 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" podStartSLOduration=124.027979967 podStartE2EDuration="2m4.027979967s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.026802022 +0000 UTC m=+142.775772258" watchObservedRunningTime="2025-12-03 14:08:50.027979967 +0000 UTC m=+142.776950203" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.040264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6pqqp" event={"ID":"5b676595-3778-4703-a0b1-654d54d007fc","Type":"ContainerStarted","Data":"fe3595d9a74edb16f15df669fe9a942003f3671059a6f6be12e3bf954240f75e"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.048463 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2g5c8" event={"ID":"af6a96c2-22d0-4efd-9df0-b6ed4dddb2c8","Type":"ContainerStarted","Data":"66144ed19ac0534fa7f0be708e7f679965555c045e75f786890103ebd74e2754"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.050269 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" event={"ID":"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f","Type":"ContainerStarted","Data":"d11d97ce45bcdba61708e9322b0083ff25a7dcd5c672c72780a53313cbabbfb7"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.051990 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rwxpp" event={"ID":"46f500f6-07e9-4242-9d25-31a3fc4e5a6d","Type":"ContainerStarted","Data":"8ab4294adf50e81571938cc27747eecaaed4ade097919157e117149a78067de7"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.057041 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-65k7d" podStartSLOduration=9.057027439 podStartE2EDuration="9.057027439s" podCreationTimestamp="2025-12-03 14:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.056112353 +0000 UTC m=+142.805082599" watchObservedRunningTime="2025-12-03 14:08:50.057027439 +0000 UTC m=+142.805997675" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.058443 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" event={"ID":"6d600ee4-d7d6-4478-bf62-1383c0f9b35c","Type":"ContainerStarted","Data":"ff2b98d1989257dad1b510ce87c00638fb03429ff5ee3b70f9e165f93ba86043"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.075029 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.078167 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" event={"ID":"f7098953-20ce-4f6d-a04e-c79d2811ecd6","Type":"ContainerStarted","Data":"483e575fe61bf6e93fc5eb57e6f0aca6b3f9ee23c0823d26e0a6b26717fd6ea8"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.093332 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.096424 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.596409065 +0000 UTC m=+143.345379301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.097461 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" event={"ID":"6defa455-47dd-4d1f-a77d-a3a4617df1b8","Type":"ContainerStarted","Data":"cf989286e998ecac2fc56a325266cd45df46bff77674a7083cb289476c572eca"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.097502 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" event={"ID":"6defa455-47dd-4d1f-a77d-a3a4617df1b8","Type":"ContainerStarted","Data":"8b768e208ff4bde5ed9485387f972931f2be1d9aadf8193be86737cb49833ad4"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.107256 5004 generic.go:334] "Generic (PLEG): container finished" podID="9c3326f1-6d06-4219-8ac1-5aa424b3e1a4" containerID="ca1e080bb1f60071af591575c001908431ed366b5730491896cbc36907666bca" exitCode=0 Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.107318 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" event={"ID":"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4","Type":"ContainerDied","Data":"ca1e080bb1f60071af591575c001908431ed366b5730491896cbc36907666bca"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.126366 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-whn2l" podStartSLOduration=124.126353204 
podStartE2EDuration="2m4.126353204s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.125964873 +0000 UTC m=+142.874935119" watchObservedRunningTime="2025-12-03 14:08:50.126353204 +0000 UTC m=+142.875323440" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.179592 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" event={"ID":"77632d37-a94f-4bc0-a07c-7880d70c7d5f","Type":"ContainerStarted","Data":"87d6ceb01f3e8fe23559d810982509afbdc56b99ed10c388399ac64a26323d64"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.179638 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" event={"ID":"77632d37-a94f-4bc0-a07c-7880d70c7d5f","Type":"ContainerStarted","Data":"32d2af16f82b1da5bb7c574fe79d95fabebdd1e8a3187c22dd8985b545b8cebb"} Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.192589 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-5646w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.192641 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5646w" podUID="8c89b35e-1cbc-45b2-b90b-ae778d622bb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.194844 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.195024 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.69500699 +0000 UTC m=+143.443977226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.196189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.196772 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.696750201 +0000 UTC m=+143.445720437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.204236 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dftsm" podStartSLOduration=124.20421633 podStartE2EDuration="2m4.20421633s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.199890293 +0000 UTC m=+142.948860529" watchObservedRunningTime="2025-12-03 14:08:50.20421633 +0000 UTC m=+142.953186566" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.233708 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.282130 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6vs6f" podStartSLOduration=124.282111827 podStartE2EDuration="2m4.282111827s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.255123584 +0000 UTC m=+143.004093820" watchObservedRunningTime="2025-12-03 14:08:50.282111827 +0000 UTC m=+143.031082063" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.284038 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xhhgn" podStartSLOduration=9.284026413 podStartE2EDuration="9.284026413s" podCreationTimestamp="2025-12-03 14:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.282786886 +0000 UTC m=+143.031757172" watchObservedRunningTime="2025-12-03 14:08:50.284026413 +0000 UTC m=+143.032996649" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.310250 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.312605 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.812583211 +0000 UTC m=+143.561553447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.418616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.419276 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:50.919256732 +0000 UTC m=+143.668226968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.521526 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.521919 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.021894815 +0000 UTC m=+143.770865051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.616499 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" podStartSLOduration=124.616483242 podStartE2EDuration="2m4.616483242s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.614542505 +0000 UTC m=+143.363512751" watchObservedRunningTime="2025-12-03 14:08:50.616483242 +0000 UTC m=+143.365453478" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.626311 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.626630 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.126617509 +0000 UTC m=+143.875587745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.648320 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" podStartSLOduration=125.648302386 podStartE2EDuration="2m5.648302386s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.647316827 +0000 UTC m=+143.396287073" watchObservedRunningTime="2025-12-03 14:08:50.648302386 +0000 UTC m=+143.397272622" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.721903 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9vcl7" podStartSLOduration=124.721884935 podStartE2EDuration="2m4.721884935s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.721626218 +0000 UTC m=+143.470596464" watchObservedRunningTime="2025-12-03 14:08:50.721884935 +0000 UTC m=+143.470855171" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.723311 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5mlg6" podStartSLOduration=124.723304197 podStartE2EDuration="2m4.723304197s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:50.688146935 +0000 UTC m=+143.437117181" watchObservedRunningTime="2025-12-03 14:08:50.723304197 +0000 UTC m=+143.472274433" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.729012 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.729416 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.229396676 +0000 UTC m=+143.978366912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.806385 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:50 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:50 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:50 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.806754 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.831379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.831717 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:51.331702479 +0000 UTC m=+144.080672715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.936799 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:50 crc kubenswrapper[5004]: E1203 14:08:50.937191 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.437174335 +0000 UTC m=+144.186144571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.950217 5004 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-n7v94 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.950295 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" podUID="ed7b294c-46b8-4519-b97a-63f8c24d8cf0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.14:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 14:08:50 crc kubenswrapper[5004]: I1203 14:08:50.985739 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.038698 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 
14:08:51.039182 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.539167609 +0000 UTC m=+144.288137845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.140068 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.140282 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.640253006 +0000 UTC m=+144.389223242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.140354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.140825 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.640809183 +0000 UTC m=+144.389779409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.162195 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.163308 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.166319 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.213669 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" event={"ID":"d598385d-7b3b-4ac4-be9d-8523a0a14bd0","Type":"ContainerStarted","Data":"e312149767f00f7eb44154c369b0952e0dc8101b83dd38e57c88db3e0d67a7f1"} Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.215686 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" event={"ID":"f7098953-20ce-4f6d-a04e-c79d2811ecd6","Type":"ContainerStarted","Data":"9d72dddcabeaadb8dc02f5bd9ffa348dd0b239b85a69646dc394c3383801a297"} Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.221095 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" 
event={"ID":"9c3326f1-6d06-4219-8ac1-5aa424b3e1a4","Type":"ContainerStarted","Data":"ed2336a3a44ed981efa7319a3b04492273ca2af5259df7df7ebc18a7f56a5ac1"} Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.222662 5004 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9mslz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.222702 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.232242 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-n7v94" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.239013 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.242410 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.242563 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpvb\" (UniqueName: \"kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb\") pod 
\"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.242595 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.242677 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.242801 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.742785796 +0000 UTC m=+144.491756032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.274655 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" podStartSLOduration=126.274633361 podStartE2EDuration="2m6.274633361s" podCreationTimestamp="2025-12-03 14:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:51.265966546 +0000 UTC m=+144.014936782" watchObservedRunningTime="2025-12-03 14:08:51.274633361 +0000 UTC m=+144.023603597" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.301464 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" podStartSLOduration=125.301447228 podStartE2EDuration="2m5.301447228s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:51.298850152 +0000 UTC m=+144.047820388" watchObservedRunningTime="2025-12-03 14:08:51.301447228 +0000 UTC m=+144.050417464" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.344657 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpvb\" (UniqueName: \"kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " 
pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.347021 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.347470 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.347898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.348163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.348241 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 14:08:51.84822471 +0000 UTC m=+144.597194946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.354842 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.357614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.360295 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.361233 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.371120 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.390239 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.397271 5004 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.398202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpvb\" (UniqueName: \"kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb\") pod \"community-operators-f5tn4\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.451379 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.452024 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.452168 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhw4r\" (UniqueName: \"kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.452245 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.452490 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:51.9524705 +0000 UTC m=+144.701440736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.476733 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.542093 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.543007 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.553052 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.553121 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.553157 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.553195 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhw4r\" (UniqueName: \"kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r\") pod \"certified-operators-lkn6v\" (UID: 
\"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.553873 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.554144 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.054130324 +0000 UTC m=+144.803100560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.554330 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.562565 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.589714 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rhw4r\" (UniqueName: \"kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r\") pod \"certified-operators-lkn6v\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.653922 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.654162 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.15413169 +0000 UTC m=+144.903101936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.654295 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.654356 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.654397 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs2m7\" (UniqueName: \"kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.654429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.654729 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.154716897 +0000 UTC m=+144.903687193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.716052 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.764011 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.765052 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.766238 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.766403 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.766436 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs2m7\" (UniqueName: \"kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.766459 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.766853 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " 
pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.767417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.771494 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.271469244 +0000 UTC m=+145.020439480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.776907 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.777032 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.777083 5004 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T14:08:51.39729506Z","Handler":null,"Name":""} Dec 03 14:08:51 crc 
kubenswrapper[5004]: I1203 14:08:51.786853 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.795752 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.825708 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.865803 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs2m7\" (UniqueName: \"kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7\") pod \"community-operators-l6c7p\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.866508 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.867978 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.868032 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.868125 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t8h\" (UniqueName: \"kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.868169 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.868207 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.868254 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.868651 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.368630686 +0000 UTC m=+145.117600922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bz6x2" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.895835 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.896035 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:51 crc kubenswrapper[5004]: [-]has-synced failed: 
reason withheld Dec 03 14:08:51 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:51 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.896081 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976004 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976516 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9t8h\" (UniqueName: \"kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976550 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976576 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " 
pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976602 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976632 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.976697 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:51 crc kubenswrapper[5004]: E1203 14:08:51.976766 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 14:08:52.47675081 +0000 UTC m=+145.225721046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.977566 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:51 crc kubenswrapper[5004]: I1203 14:08:51.977875 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.021658 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.022584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t8h\" (UniqueName: \"kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h\") pod \"certified-operators-cbkcz\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " 
pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.072278 5004 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.072334 5004 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.078279 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.084204 5004 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.084252 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.152743 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.153715 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bz6x2\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.182047 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.205034 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.224518 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.284063 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" event={"ID":"f7098953-20ce-4f6d-a04e-c79d2811ecd6","Type":"ContainerStarted","Data":"1dbe633dddef700fb28d72be882877d4c9ef43e857d096a6b88c3410b82ec6f5"} Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.285481 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.295279 5004 generic.go:334] "Generic (PLEG): container finished" podID="b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" containerID="d11d97ce45bcdba61708e9322b0083ff25a7dcd5c672c72780a53313cbabbfb7" exitCode=0 Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.295454 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" event={"ID":"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f","Type":"ContainerDied","Data":"d11d97ce45bcdba61708e9322b0083ff25a7dcd5c672c72780a53313cbabbfb7"} Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.295571 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" podUID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" containerName="controller-manager" containerID="cri-o://581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5" gracePeriod=30 Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.296544 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:52 crc kubenswrapper[5004]: W1203 14:08:52.303319 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf97b6736_c178_4178_b21b_abeb67027c36.slice/crio-af1448fe11c00ba1c13ee72a28656eddd5718c1dc1d78e55f3c37c95057bc6cd WatchSource:0}: Error finding container af1448fe11c00ba1c13ee72a28656eddd5718c1dc1d78e55f3c37c95057bc6cd: Status 404 returned error can't find the container with id af1448fe11c00ba1c13ee72a28656eddd5718c1dc1d78e55f3c37c95057bc6cd Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.457526 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.604264 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.632717 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.668662 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.676296 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.818442 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:52 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:52 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:52 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.818740 5004 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.827528 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.827581 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:08:52 crc kubenswrapper[5004]: I1203 14:08:52.979585 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.020920 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:08:53 crc kubenswrapper[5004]: E1203 14:08:53.021178 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" containerName="controller-manager" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.021195 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" containerName="controller-manager" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.021325 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" containerName="controller-manager" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.021977 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.035393 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.064598 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098012 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles\") pod \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098477 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca\") pod \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098529 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6srhg\" (UniqueName: \"kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg\") pod \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098589 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert\") pod \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098624 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config\") pod \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\" (UID: \"1b84a4a2-68c0-4b25-90c0-78e439a258a0\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098899 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tdm\" (UniqueName: \"kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098946 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1b84a4a2-68c0-4b25-90c0-78e439a258a0" (UID: "1b84a4a2-68c0-4b25-90c0-78e439a258a0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.098997 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.099096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.099279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.105416 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b84a4a2-68c0-4b25-90c0-78e439a258a0" (UID: "1b84a4a2-68c0-4b25-90c0-78e439a258a0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.106091 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.107348 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config" (OuterVolumeSpecName: "config") pod "1b84a4a2-68c0-4b25-90c0-78e439a258a0" (UID: "1b84a4a2-68c0-4b25-90c0-78e439a258a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.115910 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg" (OuterVolumeSpecName: "kube-api-access-6srhg") pod "1b84a4a2-68c0-4b25-90c0-78e439a258a0" (UID: "1b84a4a2-68c0-4b25-90c0-78e439a258a0"). InnerVolumeSpecName "kube-api-access-6srhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.137071 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b84a4a2-68c0-4b25-90c0-78e439a258a0" (UID: "1b84a4a2-68c0-4b25-90c0-78e439a258a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.147022 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.155040 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.161644 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.165403 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207487 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207552 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207600 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207632 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content\") pod \"redhat-marketplace-qrsqv\" (UID: 
\"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207690 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207744 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jgrf\" (UniqueName: \"kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207770 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8tdm\" (UniqueName: \"kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.207990 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.208008 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6srhg\" (UniqueName: \"kubernetes.io/projected/1b84a4a2-68c0-4b25-90c0-78e439a258a0-kube-api-access-6srhg\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.208022 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b84a4a2-68c0-4b25-90c0-78e439a258a0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.208032 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84a4a2-68c0-4b25-90c0-78e439a258a0-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.209276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.209662 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.213311 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert\") pod 
\"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.215730 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.226357 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8tdm\" (UniqueName: \"kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm\") pod \"controller-manager-879f6c89f-m7x67\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309043 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309106 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309174 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgrf\" (UniqueName: 
\"kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309299 5004 generic.go:334] "Generic (PLEG): container finished" podID="83f94685-023c-4305-b816-37f10184a670" containerID="1199662b4056f1b02383c3a884ca97b390ed36d893fd6781ee9bca7297680e7a" exitCode=0 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309515 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerDied","Data":"1199662b4056f1b02383c3a884ca97b390ed36d893fd6781ee9bca7297680e7a"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309555 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerStarted","Data":"aedea22d9c6051e0b25b4cd25b978091daa5dde23d8d8c16b876e3ade57fdb25"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.309783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.311886 5004 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.316891 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" event={"ID":"f7098953-20ce-4f6d-a04e-c79d2811ecd6","Type":"ContainerStarted","Data":"ca66e10484020928d38f993181f20bbd62c6b72714f6b84664617b3820928fa0"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.322027 5004 generic.go:334] "Generic (PLEG): container finished" podID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerID="a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45" exitCode=0 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.322107 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerDied","Data":"a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.322138 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerStarted","Data":"4f76424cdddd81cd0c9e7abe8fcc95bdc3bff03f26bc36e16add8d98509f42ce"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.326295 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" containerID="581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5" exitCode=0 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.326403 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" event={"ID":"1b84a4a2-68c0-4b25-90c0-78e439a258a0","Type":"ContainerDied","Data":"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.326456 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" event={"ID":"1b84a4a2-68c0-4b25-90c0-78e439a258a0","Type":"ContainerDied","Data":"af87a1dea5a0092c35c45c4705831eadd9e97899312b7658dda6581ce3f218b2"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.326486 5004 scope.go:117] "RemoveContainer" containerID="581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.327837 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4v6z" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.331725 5004 generic.go:334] "Generic (PLEG): container finished" podID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerID="4006ef83e501072f789aef7559733b64f10435c0e031e0f8fc0392fc805330a5" exitCode=0 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.331969 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerDied","Data":"4006ef83e501072f789aef7559733b64f10435c0e031e0f8fc0392fc805330a5"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.332020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerStarted","Data":"0f747f96326600c177ed5e6f25c62e86277734de0abf3737e5be3e757fcf6f7c"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.334359 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45062487-e824-4123-b752-f86b13a9fa19","Type":"ContainerStarted","Data":"48def5e362b351a3d56ba937bfd6138279bc3321ea522299a1d2fa9af2dbe8c3"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.336736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6jgrf\" (UniqueName: \"kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf\") pod \"redhat-marketplace-qrsqv\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.337829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" event={"ID":"af08e33d-fe7e-48e5-a7ae-149d75ef5595","Type":"ContainerStarted","Data":"7f9db382dda0730f92a73eb1d856fe0adc6f404f16854f3b84699648143bce6c"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.342721 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.344785 5004 generic.go:334] "Generic (PLEG): container finished" podID="f97b6736-c178-4178-b21b-abeb67027c36" containerID="9cc9bf17035c964f1ce5e7e38cc9890b6d66bf00a82398714324b0f2b2b4e3c6" exitCode=0 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.344870 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerDied","Data":"9cc9bf17035c964f1ce5e7e38cc9890b6d66bf00a82398714324b0f2b2b4e3c6"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.344942 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerStarted","Data":"af1448fe11c00ba1c13ee72a28656eddd5718c1dc1d78e55f3c37c95057bc6cd"} Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.355293 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zp9d5" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.360046 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tnfw8" podStartSLOduration=12.360023334 podStartE2EDuration="12.360023334s" podCreationTimestamp="2025-12-03 14:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:53.351976578 +0000 UTC m=+146.100946834" watchObservedRunningTime="2025-12-03 14:08:53.360023334 +0000 UTC m=+146.108993580" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.377298 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.379977 5004 scope.go:117] "RemoveContainer" containerID="581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5" Dec 03 14:08:53 crc kubenswrapper[5004]: E1203 14:08:53.381981 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5\": container with ID starting with 581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5 not found: ID does not exist" containerID="581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.382155 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5"} err="failed to get container status \"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5\": rpc error: code = NotFound desc = could not find container \"581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5\": container with ID starting with 581d10f3b96aceed44cb850a7419ef8e8d5394d2ede2dc3531059fa59cbb3eb5 not found: ID does not exist" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 
14:08:53.448611 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.453035 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4v6z"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.528018 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.529037 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.539414 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.614457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.614811 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.614906 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.614943 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.614977 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db596\" (UniqueName: \"kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.615008 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.615059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.617782 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.618816 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.619646 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.620254 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.624448 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b84a4a2-68c0-4b25-90c0-78e439a258a0" path="/var/lib/kubelet/pods/1b84a4a2-68c0-4b25-90c0-78e439a258a0/volumes" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.625238 5004 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.631839 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.664008 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.715932 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume\") pod \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.715973 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69l9d\" (UniqueName: \"kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d\") pod \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.716008 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume\") pod \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\" (UID: \"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f\") " Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.716281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " 
pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.716352 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db596\" (UniqueName: \"kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.716383 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.716992 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.717206 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.717885 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" (UID: "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.719951 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" (UID: "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.726199 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d" (OuterVolumeSpecName: "kube-api-access-69l9d") pod "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" (UID: "b0ebcf96-e3f0-4036-983c-c38f9f88ac4f"). InnerVolumeSpecName "kube-api-access-69l9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.727093 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.735946 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.739223 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db596\" (UniqueName: \"kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596\") pod \"redhat-marketplace-2fq5w\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.769357 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-5646w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.769404 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5646w" podUID="8c89b35e-1cbc-45b2-b90b-ae778d622bb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.769412 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-5646w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.769465 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5646w" podUID="8c89b35e-1cbc-45b2-b90b-ae778d622bb9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.800694 5004 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.804314 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:53 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:53 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:53 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.804427 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.817977 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.818003 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69l9d\" (UniqueName: \"kubernetes.io/projected/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-kube-api-access-69l9d\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.818012 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:53 crc kubenswrapper[5004]: W1203 14:08:53.864131 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-37cbf78f5769c809eb36dba9fe7803f8892bdeb6420bcf8e350adaa683a94d8a WatchSource:0}: Error finding container 37cbf78f5769c809eb36dba9fe7803f8892bdeb6420bcf8e350adaa683a94d8a: Status 404 returned error can't find the container with id 37cbf78f5769c809eb36dba9fe7803f8892bdeb6420bcf8e350adaa683a94d8a Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.870177 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.873387 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.875114 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.906825 5004 patch_prober.go:28] interesting pod/apiserver-76f77b778f-d5x7k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]log ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]etcd ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/max-in-flight-filter ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 14:08:53 crc kubenswrapper[5004]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 03 14:08:53 crc 
kubenswrapper[5004]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/openshift.io-startinformers ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 14:08:53 crc kubenswrapper[5004]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 14:08:53 crc kubenswrapper[5004]: livez check failed Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.908621 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" podUID="d598385d-7b3b-4ac4-be9d-8523a0a14bd0" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.926701 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:08:53 crc kubenswrapper[5004]: W1203 14:08:53.927118 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb79a1cc7_141c_4262_b5ff_df62cad6ec55.slice/crio-3cc3d425687a51aa55bcb6d3607eb5402937b2660f0c087cdb9e6e4681e97456 WatchSource:0}: Error finding container 3cc3d425687a51aa55bcb6d3607eb5402937b2660f0c087cdb9e6e4681e97456: Status 404 returned error can't find the container with id 3cc3d425687a51aa55bcb6d3607eb5402937b2660f0c087cdb9e6e4681e97456 Dec 03 14:08:53 crc kubenswrapper[5004]: W1203 14:08:53.934994 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9517257_d9b4_480b_bc6f_6424577ef33b.slice/crio-a7d793d0ec92fc434715ad94fed0e4c7bb285031fbbf3ba31bf93767d4290f08 WatchSource:0}: Error finding container a7d793d0ec92fc434715ad94fed0e4c7bb285031fbbf3ba31bf93767d4290f08: Status 404 returned error can't find the container with id a7d793d0ec92fc434715ad94fed0e4c7bb285031fbbf3ba31bf93767d4290f08 Dec 03 14:08:53 crc kubenswrapper[5004]: I1203 14:08:53.936158 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.114241 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.114777 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.120896 5004 patch_prober.go:28] interesting pod/console-f9d7485db-ll8wz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.120972 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ll8wz" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.143125 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.143453 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.146556 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.153803 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:54 crc kubenswrapper[5004]: W1203 14:08:54.154253 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e833e1_0728_440d_b3ce_ab5b89b963ef.slice/crio-421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6 WatchSource:0}: Error finding container 421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6: Status 404 returned error can't find the container with id 421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6 Dec 03 14:08:54 crc kubenswrapper[5004]: W1203 14:08:54.268503 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-a80f3b75345205767d974fa7d6a3e810aa29c8db0f3c9bac3784dfd34dd6cb28 WatchSource:0}: Error finding container a80f3b75345205767d974fa7d6a3e810aa29c8db0f3c9bac3784dfd34dd6cb28: Status 404 returned error can't find the container with id a80f3b75345205767d974fa7d6a3e810aa29c8db0f3c9bac3784dfd34dd6cb28 Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.339254 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:08:54 crc kubenswrapper[5004]: E1203 14:08:54.339486 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" containerName="collect-profiles" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.339504 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" containerName="collect-profiles" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.339612 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" containerName="collect-profiles" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.340438 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.344380 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.347980 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.375878 5004 generic.go:334] "Generic (PLEG): container finished" podID="45062487-e824-4123-b752-f86b13a9fa19" containerID="c89af68ce2b0ee2802349a64a5942840083d446bd5753efcdcaccc62690667ef" exitCode=0 Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.376309 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45062487-e824-4123-b752-f86b13a9fa19","Type":"ContainerDied","Data":"c89af68ce2b0ee2802349a64a5942840083d446bd5753efcdcaccc62690667ef"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.379231 5004 generic.go:334] "Generic (PLEG): container finished" podID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerID="1ec4d41556278b6ad5820b2ea30232ada8f95c248108f0de0f5d14a7cae3b4ed" exitCode=0 Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.379554 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerDied","Data":"1ec4d41556278b6ad5820b2ea30232ada8f95c248108f0de0f5d14a7cae3b4ed"} Dec 03 
14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.379602 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerStarted","Data":"a7d793d0ec92fc434715ad94fed0e4c7bb285031fbbf3ba31bf93767d4290f08"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.398782 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" event={"ID":"af08e33d-fe7e-48e5-a7ae-149d75ef5595","Type":"ContainerStarted","Data":"0d81f8e252698daa20ad35352f41adfb2b5b1bf6cdf2664b6c718b0d0cabb97d"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.399595 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.405842 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a1d82f0384a6bfeade8d4c284ef8bae943f01c937f8b6f712303aa2686a58f66"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.411742 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" event={"ID":"b0ebcf96-e3f0-4036-983c-c38f9f88ac4f","Type":"ContainerDied","Data":"c368fe6caa51db0f317503c565fcb9367433fd1e764205cb22331504427926cf"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.411789 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c368fe6caa51db0f317503c565fcb9367433fd1e764205cb22331504427926cf" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.411880 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.417348 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" event={"ID":"b79a1cc7-141c-4262-b5ff-df62cad6ec55","Type":"ContainerStarted","Data":"c79c91bd96223965ef0974d9a92d81c9e05f370380e0e3ceb9665ade6e0cc26e"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.417399 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" event={"ID":"b79a1cc7-141c-4262-b5ff-df62cad6ec55","Type":"ContainerStarted","Data":"3cc3d425687a51aa55bcb6d3607eb5402937b2660f0c087cdb9e6e4681e97456"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.417540 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.434972 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.435034 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzjs\" (UniqueName: \"kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.435169 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.437551 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerStarted","Data":"421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.450006 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.456055 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a80f3b75345205767d974fa7d6a3e810aa29c8db0f3c9bac3784dfd34dd6cb28"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.456737 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.458901 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" podStartSLOduration=3.45888431 podStartE2EDuration="3.45888431s" podCreationTimestamp="2025-12-03 14:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:54.4561673 +0000 UTC m=+147.205137546" watchObservedRunningTime="2025-12-03 14:08:54.45888431 +0000 UTC m=+147.207854546" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.482361 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" podStartSLOduration=128.482343469 podStartE2EDuration="2m8.482343469s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:54.48135868 +0000 UTC m=+147.230328916" watchObservedRunningTime="2025-12-03 14:08:54.482343469 +0000 UTC m=+147.231313715" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.483284 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1fb67622624e8646c90be94d2823247dfe2329ee43f0c20157660413b32878ce"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.483320 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"37cbf78f5769c809eb36dba9fe7803f8892bdeb6420bcf8e350adaa683a94d8a"} Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.499738 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jgs55" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.545195 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.545325 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities\") pod 
\"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.545347 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzjs\" (UniqueName: \"kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.546451 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.546953 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.608479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzjs\" (UniqueName: \"kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs\") pod \"redhat-operators-9m4fr\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.680024 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.739197 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.756751 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.764490 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.811790 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:54 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:54 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:54 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.811878 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.835453 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.953252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " 
pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.953890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:54 crc kubenswrapper[5004]: I1203 14:08:54.953969 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv89c\" (UniqueName: \"kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.020677 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.056194 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.056301 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.056346 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv89c\" 
(UniqueName: \"kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.059017 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.059264 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.087679 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv89c\" (UniqueName: \"kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c\") pod \"redhat-operators-czsxc\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.386794 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.493533 5004 generic.go:334] "Generic (PLEG): container finished" podID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerID="7c13d9fa9a1f05d3c5e42c2b815197bae9691fa30941a1df3377ccb8a6339ec6" exitCode=0 Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.493636 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerDied","Data":"7c13d9fa9a1f05d3c5e42c2b815197bae9691fa30941a1df3377ccb8a6339ec6"} Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.493667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerStarted","Data":"0119244b574cd1fdbaa37ee83b4ef46d9afac400c4f5528522516b929ca18b4e"} Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.495545 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4eca047d8c4b56d4023be5eb69ebd37a11263c673b7f4a09485c6a81e55ee6ee"} Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.497414 5004 generic.go:334] "Generic (PLEG): container finished" podID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerID="1f85a3c0e97d35e002fc86253917a42486b97be6c49fd0ce81d206abe54154b2" exitCode=0 Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.497548 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerDied","Data":"1f85a3c0e97d35e002fc86253917a42486b97be6c49fd0ce81d206abe54154b2"} Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.501175 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4723fc29ce5a8e2585762ebc89a555c2fde3753c8288c7242d8769ed08a496f0"} Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.803046 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:55 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:55 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:55 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.803104 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.809211 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.901726 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.965995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access\") pod \"45062487-e824-4123-b752-f86b13a9fa19\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.966082 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir\") pod \"45062487-e824-4123-b752-f86b13a9fa19\" (UID: \"45062487-e824-4123-b752-f86b13a9fa19\") " Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.966368 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45062487-e824-4123-b752-f86b13a9fa19" (UID: "45062487-e824-4123-b752-f86b13a9fa19"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:08:55 crc kubenswrapper[5004]: I1203 14:08:55.987202 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45062487-e824-4123-b752-f86b13a9fa19" (UID: "45062487-e824-4123-b752-f86b13a9fa19"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.067550 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45062487-e824-4123-b752-f86b13a9fa19-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.067588 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45062487-e824-4123-b752-f86b13a9fa19-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.508569 5004 generic.go:334] "Generic (PLEG): container finished" podID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerID="a9ba38ee55295b8f280d85948fd18ba2733812f2a88996f859188cef0f065720" exitCode=0 Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.508699 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerDied","Data":"a9ba38ee55295b8f280d85948fd18ba2733812f2a88996f859188cef0f065720"} Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.508735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerStarted","Data":"0b3270785c4c8e0ce5f1ec315c49dd4b7dc0c758b1e9cd233eecb2ddbd92674c"} Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.516717 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.518025 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"45062487-e824-4123-b752-f86b13a9fa19","Type":"ContainerDied","Data":"48def5e362b351a3d56ba937bfd6138279bc3321ea522299a1d2fa9af2dbe8c3"} Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.518096 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48def5e362b351a3d56ba937bfd6138279bc3321ea522299a1d2fa9af2dbe8c3" Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.803565 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:56 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:56 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:56 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:56 crc kubenswrapper[5004]: I1203 14:08:56.803632 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.030450 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:08:57 crc kubenswrapper[5004]: E1203 14:08:57.030701 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45062487-e824-4123-b752-f86b13a9fa19" containerName="pruner" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.030716 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45062487-e824-4123-b752-f86b13a9fa19" 
containerName="pruner" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.030844 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="45062487-e824-4123-b752-f86b13a9fa19" containerName="pruner" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.031321 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.035197 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.035294 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.046742 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.183327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.183378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.284503 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.284548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.284672 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.301598 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.360240 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.627032 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 14:08:57 crc kubenswrapper[5004]: W1203 14:08:57.639080 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e828b13_0f52_4906_a175_476f14897820.slice/crio-d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7 WatchSource:0}: Error finding container d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7: Status 404 returned error can't find the container with id d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7 Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.803294 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:57 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:57 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:57 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:57 crc kubenswrapper[5004]: I1203 14:08:57.803347 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:58 crc kubenswrapper[5004]: I1203 14:08:58.540462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e828b13-0f52-4906-a175-476f14897820","Type":"ContainerStarted","Data":"d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7"} Dec 03 14:08:58 crc kubenswrapper[5004]: I1203 14:08:58.805643 5004 patch_prober.go:28] 
interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:58 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:58 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:58 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:58 crc kubenswrapper[5004]: I1203 14:08:58.805816 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:08:58 crc kubenswrapper[5004]: I1203 14:08:58.881522 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:58 crc kubenswrapper[5004]: I1203 14:08:58.887129 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-d5x7k" Dec 03 14:08:59 crc kubenswrapper[5004]: I1203 14:08:59.338098 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-65k7d" Dec 03 14:08:59 crc kubenswrapper[5004]: I1203 14:08:59.550217 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e828b13-0f52-4906-a175-476f14897820","Type":"ContainerStarted","Data":"c12fc3aae55e25d6bec4b6c63ed2945629958dd4b62559fbf1011b2927868b6e"} Dec 03 14:08:59 crc kubenswrapper[5004]: I1203 14:08:59.565821 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.5658038359999997 podStartE2EDuration="2.565803836s" podCreationTimestamp="2025-12-03 14:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:08:59.564011153 +0000 UTC m=+152.312981389" watchObservedRunningTime="2025-12-03 14:08:59.565803836 +0000 UTC m=+152.314774072" Dec 03 14:08:59 crc kubenswrapper[5004]: I1203 14:08:59.802702 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:08:59 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:08:59 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:08:59 crc kubenswrapper[5004]: healthz check failed Dec 03 14:08:59 crc kubenswrapper[5004]: I1203 14:08:59.802999 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:09:00 crc kubenswrapper[5004]: I1203 14:09:00.568671 5004 generic.go:334] "Generic (PLEG): container finished" podID="0e828b13-0f52-4906-a175-476f14897820" containerID="c12fc3aae55e25d6bec4b6c63ed2945629958dd4b62559fbf1011b2927868b6e" exitCode=0 Dec 03 14:09:00 crc kubenswrapper[5004]: I1203 14:09:00.568723 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e828b13-0f52-4906-a175-476f14897820","Type":"ContainerDied","Data":"c12fc3aae55e25d6bec4b6c63ed2945629958dd4b62559fbf1011b2927868b6e"} Dec 03 14:09:00 crc kubenswrapper[5004]: I1203 14:09:00.802701 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:09:00 crc kubenswrapper[5004]: [-]has-synced failed: 
reason withheld Dec 03 14:09:00 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:09:00 crc kubenswrapper[5004]: healthz check failed Dec 03 14:09:00 crc kubenswrapper[5004]: I1203 14:09:00.802760 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.805818 5004 patch_prober.go:28] interesting pod/router-default-5444994796-ltc99 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 14:09:01 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 03 14:09:01 crc kubenswrapper[5004]: [+]process-running ok Dec 03 14:09:01 crc kubenswrapper[5004]: healthz check failed Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.806213 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ltc99" podUID="8704c023-6680-4430-a7e7-b4aa5a76d365" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.874810 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.961591 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir\") pod \"0e828b13-0f52-4906-a175-476f14897820\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.961710 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access\") pod \"0e828b13-0f52-4906-a175-476f14897820\" (UID: \"0e828b13-0f52-4906-a175-476f14897820\") " Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.961717 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e828b13-0f52-4906-a175-476f14897820" (UID: "0e828b13-0f52-4906-a175-476f14897820"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.962085 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e828b13-0f52-4906-a175-476f14897820-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:01 crc kubenswrapper[5004]: I1203 14:09:01.990524 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e828b13-0f52-4906-a175-476f14897820" (UID: "0e828b13-0f52-4906-a175-476f14897820"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.063995 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e828b13-0f52-4906-a175-476f14897820-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.583512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e828b13-0f52-4906-a175-476f14897820","Type":"ContainerDied","Data":"d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7"} Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.583560 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d368dc75f0df8d537c91467d0f4dfcac6395a63719761c61f63a3a24de900ac7" Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.583580 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.803073 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:09:02 crc kubenswrapper[5004]: I1203 14:09:02.804958 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ltc99" Dec 03 14:09:03 crc kubenswrapper[5004]: I1203 14:09:03.775816 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5646w" Dec 03 14:09:04 crc kubenswrapper[5004]: I1203 14:09:04.118223 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:09:04 crc kubenswrapper[5004]: I1203 14:09:04.122240 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:09:08 crc kubenswrapper[5004]: I1203 14:09:08.461648 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:09:08 crc kubenswrapper[5004]: I1203 14:09:08.466795 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54394065-8262-4c2e-abdb-c81b096049ef-metrics-certs\") pod \"network-metrics-daemon-dgzr8\" (UID: \"54394065-8262-4c2e-abdb-c81b096049ef\") " pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:09:08 crc kubenswrapper[5004]: I1203 14:09:08.742009 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dgzr8" Dec 03 14:09:12 crc kubenswrapper[5004]: I1203 14:09:12.465765 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:09:22 crc kubenswrapper[5004]: I1203 14:09:22.824641 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:09:22 crc kubenswrapper[5004]: I1203 14:09:22.824983 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:09:23 crc 
kubenswrapper[5004]: E1203 14:09:23.771203 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 14:09:23 crc kubenswrapper[5004]: E1203 14:09:23.771406 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jgrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-qrsqv_openshift-marketplace(e9517257-d9b4-480b-bc6f-6424577ef33b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:23 crc kubenswrapper[5004]: E1203 14:09:23.772614 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qrsqv" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" Dec 03 14:09:23 crc kubenswrapper[5004]: E1203 14:09:23.785554 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 14:09:23 crc kubenswrapper[5004]: E1203 14:09:23.785749 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-db596,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2fq5w_openshift-marketplace(90e833e1-0728-440d-b3ce-ab5b89b963ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:23 crc kubenswrapper[5004]: E1203 14:09:23.787047 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2fq5w" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" Dec 03 14:09:25 crc 
kubenswrapper[5004]: I1203 14:09:25.053272 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-chjwd" Dec 03 14:09:26 crc kubenswrapper[5004]: E1203 14:09:26.412639 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2fq5w" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" Dec 03 14:09:26 crc kubenswrapper[5004]: E1203 14:09:26.413422 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qrsqv" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" Dec 03 14:09:26 crc kubenswrapper[5004]: E1203 14:09:26.492701 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 14:09:26 crc kubenswrapper[5004]: E1203 14:09:26.492925 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv89c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-czsxc_openshift-marketplace(0b49f439-9668-4e11-92b3-03bac9e07f39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:26 crc kubenswrapper[5004]: E1203 14:09:26.494135 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-czsxc" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" Dec 03 14:09:27 crc 
kubenswrapper[5004]: E1203 14:09:27.730507 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-czsxc" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" Dec 03 14:09:27 crc kubenswrapper[5004]: E1203 14:09:27.808807 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 14:09:27 crc kubenswrapper[5004]: E1203 14:09:27.808976 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g9t8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cbkcz_openshift-marketplace(7beb6a23-0e72-4008-a9a4-f20d972a2500): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:27 crc kubenswrapper[5004]: E1203 14:09:27.810177 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cbkcz" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" Dec 03 14:09:27 crc 
kubenswrapper[5004]: E1203 14:09:27.822708 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 14:09:27 crc kubenswrapper[5004]: E1203 14:09:27.822942 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzzjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-9m4fr_openshift-marketplace(ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:27 crc kubenswrapper[5004]: E1203 14:09:27.824337 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9m4fr" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.205246 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cbkcz" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.205253 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9m4fr" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.218969 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.219177 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bs2m7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l6c7p_openshift-marketplace(83f94685-023c-4305-b816-37f10184a670): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.224040 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l6c7p" podUID="83f94685-023c-4305-b816-37f10184a670" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.251099 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.251257 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhw4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lkn6v_openshift-marketplace(098255d0-cc88-4fba-bbff-b4427d1dac07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.252596 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lkn6v" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.263975 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.264114 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljpvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-f5tn4_openshift-marketplace(f97b6736-c178-4178-b21b-abeb67027c36): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.265415 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-f5tn4" podUID="f97b6736-c178-4178-b21b-abeb67027c36" Dec 03 14:09:29 crc 
kubenswrapper[5004]: I1203 14:09:29.604125 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dgzr8"] Dec 03 14:09:29 crc kubenswrapper[5004]: W1203 14:09:29.613390 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54394065_8262_4c2e_abdb_c81b096049ef.slice/crio-fe27b5028b074d9c285489cbe41981212a409d38a36d86f007931efa20abff75 WatchSource:0}: Error finding container fe27b5028b074d9c285489cbe41981212a409d38a36d86f007931efa20abff75: Status 404 returned error can't find the container with id fe27b5028b074d9c285489cbe41981212a409d38a36d86f007931efa20abff75 Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.734622 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" event={"ID":"54394065-8262-4c2e-abdb-c81b096049ef","Type":"ContainerStarted","Data":"fe27b5028b074d9c285489cbe41981212a409d38a36d86f007931efa20abff75"} Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.736313 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lkn6v" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.736376 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l6c7p" podUID="83f94685-023c-4305-b816-37f10184a670" Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.736809 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-f5tn4" podUID="f97b6736-c178-4178-b21b-abeb67027c36" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.841577 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 14:09:29 crc kubenswrapper[5004]: E1203 14:09:29.841842 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e828b13-0f52-4906-a175-476f14897820" containerName="pruner" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.843942 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e828b13-0f52-4906-a175-476f14897820" containerName="pruner" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.844123 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e828b13-0f52-4906-a175-476f14897820" containerName="pruner" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.844433 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.844514 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.847841 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.848373 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.972449 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:29 crc kubenswrapper[5004]: I1203 14:09:29.972793 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.074578 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.074627 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.074693 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.092932 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.168205 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.670940 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.751519 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" event={"ID":"54394065-8262-4c2e-abdb-c81b096049ef","Type":"ContainerStarted","Data":"3c7cba1fcfbf1dc3dc97cfcb51ba3d1be170f67c5c03ba859c16f2f2bcc7db35"} Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.751947 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dgzr8" event={"ID":"54394065-8262-4c2e-abdb-c81b096049ef","Type":"ContainerStarted","Data":"281cbd34ebc5b6df768fa55c9922b1eab1f89e93b97deb7e1d9e8bc5392562a4"} Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.753407 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"2400c051-8d66-4228-a229-eb98ac2faa0e","Type":"ContainerStarted","Data":"212d857744d5d69f2740c382b0dcc302ca6b7094084761fe7ff09444d701e614"} Dec 03 14:09:30 crc kubenswrapper[5004]: I1203 14:09:30.776481 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dgzr8" podStartSLOduration=164.776459443 podStartE2EDuration="2m44.776459443s" podCreationTimestamp="2025-12-03 14:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:09:30.769486168 +0000 UTC m=+183.518456434" watchObservedRunningTime="2025-12-03 14:09:30.776459443 +0000 UTC m=+183.525429689" Dec 03 14:09:31 crc kubenswrapper[5004]: I1203 14:09:31.761251 5004 generic.go:334] "Generic (PLEG): container finished" podID="2400c051-8d66-4228-a229-eb98ac2faa0e" containerID="374af8c46956cf9b63bc160c29c9f91d9c03efcdc81681183245411127a0cf69" exitCode=0 Dec 03 14:09:31 crc kubenswrapper[5004]: I1203 14:09:31.761355 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2400c051-8d66-4228-a229-eb98ac2faa0e","Type":"ContainerDied","Data":"374af8c46956cf9b63bc160c29c9f91d9c03efcdc81681183245411127a0cf69"} Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.131041 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.216255 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access\") pod \"2400c051-8d66-4228-a229-eb98ac2faa0e\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.216412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir\") pod \"2400c051-8d66-4228-a229-eb98ac2faa0e\" (UID: \"2400c051-8d66-4228-a229-eb98ac2faa0e\") " Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.216534 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2400c051-8d66-4228-a229-eb98ac2faa0e" (UID: "2400c051-8d66-4228-a229-eb98ac2faa0e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.216682 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2400c051-8d66-4228-a229-eb98ac2faa0e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.221444 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2400c051-8d66-4228-a229-eb98ac2faa0e" (UID: "2400c051-8d66-4228-a229-eb98ac2faa0e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.318081 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2400c051-8d66-4228-a229-eb98ac2faa0e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.739936 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.778418 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2400c051-8d66-4228-a229-eb98ac2faa0e","Type":"ContainerDied","Data":"212d857744d5d69f2740c382b0dcc302ca6b7094084761fe7ff09444d701e614"} Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.778479 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212d857744d5d69f2740c382b0dcc302ca6b7094084761fe7ff09444d701e614" Dec 03 14:09:33 crc kubenswrapper[5004]: I1203 14:09:33.778510 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.236806 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 14:09:36 crc kubenswrapper[5004]: E1203 14:09:36.237326 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2400c051-8d66-4228-a229-eb98ac2faa0e" containerName="pruner" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.237337 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2400c051-8d66-4228-a229-eb98ac2faa0e" containerName="pruner" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.237434 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2400c051-8d66-4228-a229-eb98ac2faa0e" containerName="pruner" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.237867 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.241507 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.243272 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.243450 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.360764 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.360814 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.360835 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.464916 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.464970 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.465001 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.465293 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.465348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.499073 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access\") pod \"installer-9-crc\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:36 crc kubenswrapper[5004]: I1203 14:09:36.765496 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:09:37 crc kubenswrapper[5004]: I1203 14:09:37.166207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 14:09:37 crc kubenswrapper[5004]: W1203 14:09:37.190952 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5cbc28ec_8596_4137_97d8_7c0d3f19043c.slice/crio-2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f WatchSource:0}: Error finding container 2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f: Status 404 returned error can't find the container with id 2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f Dec 03 14:09:37 crc kubenswrapper[5004]: I1203 14:09:37.799290 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5cbc28ec-8596-4137-97d8-7c0d3f19043c","Type":"ContainerStarted","Data":"2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f"} Dec 03 14:09:38 crc kubenswrapper[5004]: I1203 14:09:38.805717 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5cbc28ec-8596-4137-97d8-7c0d3f19043c","Type":"ContainerStarted","Data":"c168f273699c41bfb08b77bca0ee0c49319fb136c03026f49e0da55ce79494c4"} Dec 03 14:09:38 crc kubenswrapper[5004]: I1203 14:09:38.822442 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.822418991 podStartE2EDuration="2.822418991s" podCreationTimestamp="2025-12-03 14:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:09:38.821212855 +0000 UTC m=+191.570183141" watchObservedRunningTime="2025-12-03 14:09:38.822418991 +0000 UTC m=+191.571389237" Dec 03 14:09:42 crc kubenswrapper[5004]: I1203 
14:09:42.834981 5004 generic.go:334] "Generic (PLEG): container finished" podID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerID="b68d45d6514cdf50f2686e09e7fb35806868a00aeb4108c49027943d455e8511" exitCode=0 Dec 03 14:09:42 crc kubenswrapper[5004]: I1203 14:09:42.835064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerDied","Data":"b68d45d6514cdf50f2686e09e7fb35806868a00aeb4108c49027943d455e8511"} Dec 03 14:09:42 crc kubenswrapper[5004]: I1203 14:09:42.841045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerStarted","Data":"8a2cd742b3cb051fcf14ccda3b704c9f82e695d5d27c62cdaa12cbff5e341508"} Dec 03 14:09:42 crc kubenswrapper[5004]: I1203 14:09:42.870486 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"] Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.851593 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerStarted","Data":"42149d0bd11e918117b8b8af77af197a641ecb5e7c90386e57e69fe0f294047c"} Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.854519 5004 generic.go:334] "Generic (PLEG): container finished" podID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerID="8a2cd742b3cb051fcf14ccda3b704c9f82e695d5d27c62cdaa12cbff5e341508" exitCode=0 Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.854554 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerDied","Data":"8a2cd742b3cb051fcf14ccda3b704c9f82e695d5d27c62cdaa12cbff5e341508"} Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.854575 
5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerStarted","Data":"93a9c33f3487a8e4fb70ec9c101c323f058deb598d7dfa0b6ea1339df89e75b1"} Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.874792 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.874847 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.895389 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fq5w" podStartSLOduration=1.776915896 podStartE2EDuration="50.89536949s" podCreationTimestamp="2025-12-03 14:08:53 +0000 UTC" firstStartedPulling="2025-12-03 14:08:54.450543065 +0000 UTC m=+147.199513311" lastFinishedPulling="2025-12-03 14:09:43.568996669 +0000 UTC m=+196.317966905" observedRunningTime="2025-12-03 14:09:43.874446106 +0000 UTC m=+196.623416342" watchObservedRunningTime="2025-12-03 14:09:43.89536949 +0000 UTC m=+196.644339736" Dec 03 14:09:43 crc kubenswrapper[5004]: I1203 14:09:43.898674 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qrsqv" podStartSLOduration=1.8050315609999998 podStartE2EDuration="50.898533503s" podCreationTimestamp="2025-12-03 14:08:53 +0000 UTC" firstStartedPulling="2025-12-03 14:08:54.3934867 +0000 UTC m=+147.142456936" lastFinishedPulling="2025-12-03 14:09:43.486988642 +0000 UTC m=+196.235958878" observedRunningTime="2025-12-03 14:09:43.892955079 +0000 UTC m=+196.641925325" watchObservedRunningTime="2025-12-03 14:09:43.898533503 +0000 UTC m=+196.647503749" Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.859921 5004 generic.go:334] "Generic 
(PLEG): container finished" podID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerID="58844bbedea68b014ccd5fb026a667cabc2f1e2388ddbc3941e783a46e573d41" exitCode=0 Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.860015 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerDied","Data":"58844bbedea68b014ccd5fb026a667cabc2f1e2388ddbc3941e783a46e573d41"} Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.861480 5004 generic.go:334] "Generic (PLEG): container finished" podID="83f94685-023c-4305-b816-37f10184a670" containerID="8f36041cc560919b5c4a43555edf19e58e0aeaabb306517ae31c7e8ab4277053" exitCode=0 Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.861538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerDied","Data":"8f36041cc560919b5c4a43555edf19e58e0aeaabb306517ae31c7e8ab4277053"} Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.863392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerStarted","Data":"b21782556c38acc5332c98fbbb1f742e24a747f2add05c227a2771584d348efd"} Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.867774 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerStarted","Data":"402235e762e4ca58c6c55f115607b41cc72c5c2274c86760dda77fa76fe0bbe3"} Dec 03 14:09:44 crc kubenswrapper[5004]: I1203 14:09:44.929899 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2fq5w" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="registry-server" probeResult="failure" output=< Dec 03 14:09:44 
crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 03 14:09:44 crc kubenswrapper[5004]: > Dec 03 14:09:45 crc kubenswrapper[5004]: I1203 14:09:45.876541 5004 generic.go:334] "Generic (PLEG): container finished" podID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerID="402235e762e4ca58c6c55f115607b41cc72c5c2274c86760dda77fa76fe0bbe3" exitCode=0 Dec 03 14:09:45 crc kubenswrapper[5004]: I1203 14:09:45.876617 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerDied","Data":"402235e762e4ca58c6c55f115607b41cc72c5c2274c86760dda77fa76fe0bbe3"} Dec 03 14:09:45 crc kubenswrapper[5004]: I1203 14:09:45.880292 5004 generic.go:334] "Generic (PLEG): container finished" podID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerID="b21782556c38acc5332c98fbbb1f742e24a747f2add05c227a2771584d348efd" exitCode=0 Dec 03 14:09:45 crc kubenswrapper[5004]: I1203 14:09:45.880321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerDied","Data":"b21782556c38acc5332c98fbbb1f742e24a747f2add05c227a2771584d348efd"} Dec 03 14:09:48 crc kubenswrapper[5004]: I1203 14:09:48.897643 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerStarted","Data":"6766784ce9d042e70b4e6ffd2cfd652010879144341f3a67b2fc0265d2e9e3ad"} Dec 03 14:09:49 crc kubenswrapper[5004]: I1203 14:09:49.920953 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9m4fr" podStartSLOduration=3.665838195 podStartE2EDuration="55.920907955s" podCreationTimestamp="2025-12-03 14:08:54 +0000 UTC" firstStartedPulling="2025-12-03 14:08:55.502475052 +0000 UTC m=+148.251445288" 
lastFinishedPulling="2025-12-03 14:09:47.757544812 +0000 UTC m=+200.506515048" observedRunningTime="2025-12-03 14:09:49.919412492 +0000 UTC m=+202.668382728" watchObservedRunningTime="2025-12-03 14:09:49.920907955 +0000 UTC m=+202.669878181" Dec 03 14:09:50 crc kubenswrapper[5004]: I1203 14:09:50.188206 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:09:50 crc kubenswrapper[5004]: I1203 14:09:50.188444 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" podUID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" containerName="controller-manager" containerID="cri-o://c79c91bd96223965ef0974d9a92d81c9e05f370380e0e3ceb9665ade6e0cc26e" gracePeriod=30 Dec 03 14:09:50 crc kubenswrapper[5004]: I1203 14:09:50.286508 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"] Dec 03 14:09:50 crc kubenswrapper[5004]: I1203 14:09:50.287107 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" podUID="d787a412-6039-41df-9007-e70b05b958a4" containerName="route-controller-manager" containerID="cri-o://b34f3a007c74814928362cc5aa1c9267288a5726c52ec6e673e1fc256362e172" gracePeriod=30 Dec 03 14:09:51 crc kubenswrapper[5004]: I1203 14:09:51.916050 5004 generic.go:334] "Generic (PLEG): container finished" podID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" containerID="c79c91bd96223965ef0974d9a92d81c9e05f370380e0e3ceb9665ade6e0cc26e" exitCode=0 Dec 03 14:09:51 crc kubenswrapper[5004]: I1203 14:09:51.916094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" 
event={"ID":"b79a1cc7-141c-4262-b5ff-df62cad6ec55","Type":"ContainerDied","Data":"c79c91bd96223965ef0974d9a92d81c9e05f370380e0e3ceb9665ade6e0cc26e"} Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.572683 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.600931 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:09:52 crc kubenswrapper[5004]: E1203 14:09:52.601224 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" containerName="controller-manager" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.601246 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" containerName="controller-manager" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.601372 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" containerName="controller-manager" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.601829 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.618056 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.667097 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert\") pod \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.667181 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config\") pod \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.667229 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca\") pod \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.667318 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8tdm\" (UniqueName: \"kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm\") pod \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\" (UID: \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.667346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles\") pod \"b79a1cc7-141c-4262-b5ff-df62cad6ec55\" (UID: 
\"b79a1cc7-141c-4262-b5ff-df62cad6ec55\") " Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.668335 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config" (OuterVolumeSpecName: "config") pod "b79a1cc7-141c-4262-b5ff-df62cad6ec55" (UID: "b79a1cc7-141c-4262-b5ff-df62cad6ec55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.668742 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca" (OuterVolumeSpecName: "client-ca") pod "b79a1cc7-141c-4262-b5ff-df62cad6ec55" (UID: "b79a1cc7-141c-4262-b5ff-df62cad6ec55"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.672180 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b79a1cc7-141c-4262-b5ff-df62cad6ec55" (UID: "b79a1cc7-141c-4262-b5ff-df62cad6ec55"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.672724 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b79a1cc7-141c-4262-b5ff-df62cad6ec55" (UID: "b79a1cc7-141c-4262-b5ff-df62cad6ec55"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.675226 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm" (OuterVolumeSpecName: "kube-api-access-x8tdm") pod "b79a1cc7-141c-4262-b5ff-df62cad6ec55" (UID: "b79a1cc7-141c-4262-b5ff-df62cad6ec55"). InnerVolumeSpecName "kube-api-access-x8tdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769141 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769206 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769247 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrxw\" 
(UniqueName: \"kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769354 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769403 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8tdm\" (UniqueName: \"kubernetes.io/projected/b79a1cc7-141c-4262-b5ff-df62cad6ec55-kube-api-access-x8tdm\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769417 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769431 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b79a1cc7-141c-4262-b5ff-df62cad6ec55-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769445 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.769457 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b79a1cc7-141c-4262-b5ff-df62cad6ec55-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.824126 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.824183 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.824227 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.824775 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.824820 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667" gracePeriod=600 Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.870946 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.870996 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.871034 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.871138 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrxw\" (UniqueName: \"kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.871184 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " 
pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.872348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.872639 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.872751 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.883041 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert\") pod \"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.889438 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrxw\" (UniqueName: \"kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw\") pod 
\"controller-manager-56ccbbf646-xjgct\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.923293 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.928693 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.928784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m7x67" event={"ID":"b79a1cc7-141c-4262-b5ff-df62cad6ec55","Type":"ContainerDied","Data":"3cc3d425687a51aa55bcb6d3607eb5402937b2660f0c087cdb9e6e4681e97456"} Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.928847 5004 scope.go:117] "RemoveContainer" containerID="c79c91bd96223965ef0974d9a92d81c9e05f370380e0e3ceb9665ade6e0cc26e" Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.939227 5004 generic.go:334] "Generic (PLEG): container finished" podID="d787a412-6039-41df-9007-e70b05b958a4" containerID="b34f3a007c74814928362cc5aa1c9267288a5726c52ec6e673e1fc256362e172" exitCode=0 Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.939278 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" event={"ID":"d787a412-6039-41df-9007-e70b05b958a4","Type":"ContainerDied","Data":"b34f3a007c74814928362cc5aa1c9267288a5726c52ec6e673e1fc256362e172"} Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.961129 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:09:52 crc kubenswrapper[5004]: I1203 14:09:52.963653 5004 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m7x67"] Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.024918 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.174454 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert\") pod \"d787a412-6039-41df-9007-e70b05b958a4\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.174895 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca\") pod \"d787a412-6039-41df-9007-e70b05b958a4\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.174953 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config\") pod \"d787a412-6039-41df-9007-e70b05b958a4\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.174993 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npxcc\" (UniqueName: \"kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc\") pod \"d787a412-6039-41df-9007-e70b05b958a4\" (UID: \"d787a412-6039-41df-9007-e70b05b958a4\") " Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.178611 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"d787a412-6039-41df-9007-e70b05b958a4" (UID: "d787a412-6039-41df-9007-e70b05b958a4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.178657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config" (OuterVolumeSpecName: "config") pod "d787a412-6039-41df-9007-e70b05b958a4" (UID: "d787a412-6039-41df-9007-e70b05b958a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.179035 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d787a412-6039-41df-9007-e70b05b958a4" (UID: "d787a412-6039-41df-9007-e70b05b958a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.179220 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc" (OuterVolumeSpecName: "kube-api-access-npxcc") pod "d787a412-6039-41df-9007-e70b05b958a4" (UID: "d787a412-6039-41df-9007-e70b05b958a4"). InnerVolumeSpecName "kube-api-access-npxcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.276210 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d787a412-6039-41df-9007-e70b05b958a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.276250 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.276263 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d787a412-6039-41df-9007-e70b05b958a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.276277 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npxcc\" (UniqueName: \"kubernetes.io/projected/d787a412-6039-41df-9007-e70b05b958a4-kube-api-access-npxcc\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.377820 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.377945 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.418299 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.498690 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:09:53 crc kubenswrapper[5004]: W1203 14:09:53.500198 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45de6168_4622_4916_84f1_c0d15f570daf.slice/crio-a86ad740c9e23ec67a067bf845ec33912ab18f24cfe453714c8817682d8c231f WatchSource:0}: Error finding container a86ad740c9e23ec67a067bf845ec33912ab18f24cfe453714c8817682d8c231f: Status 404 returned error can't find the container with id a86ad740c9e23ec67a067bf845ec33912ab18f24cfe453714c8817682d8c231f Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.618952 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79a1cc7-141c-4262-b5ff-df62cad6ec55" path="/var/lib/kubelet/pods/b79a1cc7-141c-4262-b5ff-df62cad6ec55/volumes" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.914966 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.950328 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667" exitCode=0 Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.950431 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667"} Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.952293 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" event={"ID":"d787a412-6039-41df-9007-e70b05b958a4","Type":"ContainerDied","Data":"26a58b0863fbb2f2a03e920d067dde474573d75bfb8e1639fa7ef839d5d3e42c"} Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.952302 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.952344 5004 scope.go:117] "RemoveContainer" containerID="b34f3a007c74814928362cc5aa1c9267288a5726c52ec6e673e1fc256362e172" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.954495 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.962644 5004 generic.go:334] "Generic (PLEG): container finished" podID="f97b6736-c178-4178-b21b-abeb67027c36" containerID="619051eb2811045d647378ee3b51956eccfbf3c2d9794ec25886a2d8e2ce8c62" exitCode=0 Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.962739 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerDied","Data":"619051eb2811045d647378ee3b51956eccfbf3c2d9794ec25886a2d8e2ce8c62"} Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.963907 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" event={"ID":"45de6168-4622-4916-84f1-c0d15f570daf","Type":"ContainerStarted","Data":"a86ad740c9e23ec67a067bf845ec33912ab18f24cfe453714c8817682d8c231f"} Dec 03 14:09:53 crc kubenswrapper[5004]: I1203 14:09:53.966395 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerStarted","Data":"732530e240ad9bc753060ca6f5126195d4c1d7ed67442aa0f480b1682e563728"} Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.006297 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l6c7p" podStartSLOduration=3.579969852 podStartE2EDuration="1m3.006279059s" 
podCreationTimestamp="2025-12-03 14:08:51 +0000 UTC" firstStartedPulling="2025-12-03 14:08:53.311447908 +0000 UTC m=+146.060418154" lastFinishedPulling="2025-12-03 14:09:52.737757125 +0000 UTC m=+205.486727361" observedRunningTime="2025-12-03 14:09:54.001397238 +0000 UTC m=+206.750367484" watchObservedRunningTime="2025-12-03 14:09:54.006279059 +0000 UTC m=+206.755249295" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.008388 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.017076 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"] Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.021980 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mjz2z"] Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.685763 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.686229 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.741704 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.974147 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerStarted","Data":"3657455d95b821f0a6a52c3f730ecfdc12bbc4b673c43b50fdff9ddec97683f8"} Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.976296 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" event={"ID":"45de6168-4622-4916-84f1-c0d15f570daf","Type":"ContainerStarted","Data":"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413"} Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.976926 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.978377 5004 generic.go:334] "Generic (PLEG): container finished" podID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerID="9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8" exitCode=0 Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.978447 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerDied","Data":"9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8"} Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.981158 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46"} Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.982529 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:09:54 crc kubenswrapper[5004]: I1203 14:09:54.984871 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerStarted","Data":"25f45ca38031fb20e09355700370117aab25b04985df8f33f15ad51c3cdebe00"} Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.005493 5004 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-cbkcz" podStartSLOduration=2.909874057 podStartE2EDuration="1m4.005473201s" podCreationTimestamp="2025-12-03 14:08:51 +0000 UTC" firstStartedPulling="2025-12-03 14:08:53.333480585 +0000 UTC m=+146.082450831" lastFinishedPulling="2025-12-03 14:09:54.429079739 +0000 UTC m=+207.178049975" observedRunningTime="2025-12-03 14:09:55.004828902 +0000 UTC m=+207.753799138" watchObservedRunningTime="2025-12-03 14:09:55.005473201 +0000 UTC m=+207.754443437" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.035590 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.047566 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" podStartSLOduration=5.047546545 podStartE2EDuration="5.047546545s" podCreationTimestamp="2025-12-03 14:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:09:55.043696454 +0000 UTC m=+207.792666690" watchObservedRunningTime="2025-12-03 14:09:55.047546545 +0000 UTC m=+207.796516781" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.127404 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-czsxc" podStartSLOduration=3.206279709 podStartE2EDuration="1m1.127388728s" podCreationTimestamp="2025-12-03 14:08:54 +0000 UTC" firstStartedPulling="2025-12-03 14:08:56.511890823 +0000 UTC m=+149.260861059" lastFinishedPulling="2025-12-03 14:09:54.432999802 +0000 UTC m=+207.181970078" observedRunningTime="2025-12-03 14:09:55.123036553 +0000 UTC m=+207.872006789" watchObservedRunningTime="2025-12-03 14:09:55.127388728 +0000 UTC m=+207.876358975" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.171109 5004 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:09:55 crc kubenswrapper[5004]: E1203 14:09:55.171314 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d787a412-6039-41df-9007-e70b05b958a4" containerName="route-controller-manager" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.171326 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d787a412-6039-41df-9007-e70b05b958a4" containerName="route-controller-manager" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.171431 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d787a412-6039-41df-9007-e70b05b958a4" containerName="route-controller-manager" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.171783 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.175394 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.175707 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.175939 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.176108 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.176791 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 
14:09:55.177675 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.181572 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.301329 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dr78\" (UniqueName: \"kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.301382 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.301410 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.301440 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca\") pod 
\"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.387705 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.387785 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.402821 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dr78\" (UniqueName: \"kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.402885 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.402911 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.402942 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.403783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.404230 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.411124 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.420042 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dr78\" (UniqueName: \"kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78\") pod \"route-controller-manager-5f95f9d96b-ltfcn\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " 
pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.492045 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.652508 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d787a412-6039-41df-9007-e70b05b958a4" path="/var/lib/kubelet/pods/d787a412-6039-41df-9007-e70b05b958a4/volumes" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.653411 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.654046 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fq5w" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="registry-server" containerID="cri-o://42149d0bd11e918117b8b8af77af197a641ecb5e7c90386e57e69fe0f294047c" gracePeriod=2 Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.976610 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.994424 5004 generic.go:334] "Generic (PLEG): container finished" podID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerID="42149d0bd11e918117b8b8af77af197a641ecb5e7c90386e57e69fe0f294047c" exitCode=0 Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.994508 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerDied","Data":"42149d0bd11e918117b8b8af77af197a641ecb5e7c90386e57e69fe0f294047c"} Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.994549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2fq5w" event={"ID":"90e833e1-0728-440d-b3ce-ab5b89b963ef","Type":"ContainerDied","Data":"421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6"} Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.994563 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421ee7378b22f323d1525aa9166eabe5b660ca1c98a5a63077fab55d59159aa6" Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.996644 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerStarted","Data":"39c0fb8adab88fb69def818b6a964b82e3aa8314037fa7f20b937465db21df33"} Dec 03 14:09:55 crc kubenswrapper[5004]: I1203 14:09:55.998806 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerStarted","Data":"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173"} Dec 03 14:09:56 crc kubenswrapper[5004]: W1203 14:09:56.003731 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6896529a_3a6e_4ed1_95ef_69f76b042585.slice/crio-0882cf5f1678d5033e133195b71728370bde88feeefe135fe527f6e7563b9964 WatchSource:0}: Error finding container 0882cf5f1678d5033e133195b71728370bde88feeefe135fe527f6e7563b9964: Status 404 returned error can't find the container with id 0882cf5f1678d5033e133195b71728370bde88feeefe135fe527f6e7563b9964 Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.013068 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f5tn4" podStartSLOduration=3.019465112 podStartE2EDuration="1m5.013051093s" podCreationTimestamp="2025-12-03 14:08:51 +0000 UTC" firstStartedPulling="2025-12-03 14:08:53.362673372 +0000 UTC m=+146.111643608" 
lastFinishedPulling="2025-12-03 14:09:55.356259333 +0000 UTC m=+208.105229589" observedRunningTime="2025-12-03 14:09:56.013034403 +0000 UTC m=+208.762004669" watchObservedRunningTime="2025-12-03 14:09:56.013051093 +0000 UTC m=+208.762021329" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.031683 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lkn6v" podStartSLOduration=2.983105544 podStartE2EDuration="1m5.031664421s" podCreationTimestamp="2025-12-03 14:08:51 +0000 UTC" firstStartedPulling="2025-12-03 14:08:53.3237832 +0000 UTC m=+146.072753436" lastFinishedPulling="2025-12-03 14:09:55.372342077 +0000 UTC m=+208.121312313" observedRunningTime="2025-12-03 14:09:56.030999841 +0000 UTC m=+208.779970077" watchObservedRunningTime="2025-12-03 14:09:56.031664421 +0000 UTC m=+208.780634657" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.049351 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.134346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content\") pod \"90e833e1-0728-440d-b3ce-ab5b89b963ef\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.134702 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db596\" (UniqueName: \"kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596\") pod \"90e833e1-0728-440d-b3ce-ab5b89b963ef\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.134874 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities\") pod \"90e833e1-0728-440d-b3ce-ab5b89b963ef\" (UID: \"90e833e1-0728-440d-b3ce-ab5b89b963ef\") " Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.136560 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities" (OuterVolumeSpecName: "utilities") pod "90e833e1-0728-440d-b3ce-ab5b89b963ef" (UID: "90e833e1-0728-440d-b3ce-ab5b89b963ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.144278 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596" (OuterVolumeSpecName: "kube-api-access-db596") pod "90e833e1-0728-440d-b3ce-ab5b89b963ef" (UID: "90e833e1-0728-440d-b3ce-ab5b89b963ef"). InnerVolumeSpecName "kube-api-access-db596". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.162305 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90e833e1-0728-440d-b3ce-ab5b89b963ef" (UID: "90e833e1-0728-440d-b3ce-ab5b89b963ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.236532 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.236569 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db596\" (UniqueName: \"kubernetes.io/projected/90e833e1-0728-440d-b3ce-ab5b89b963ef-kube-api-access-db596\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.236580 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e833e1-0728-440d-b3ce-ab5b89b963ef-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:09:56 crc kubenswrapper[5004]: I1203 14:09:56.439992 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-czsxc" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="registry-server" probeResult="failure" output=< Dec 03 14:09:56 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 03 14:09:56 crc kubenswrapper[5004]: > Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.005783 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" event={"ID":"6896529a-3a6e-4ed1-95ef-69f76b042585","Type":"ContainerStarted","Data":"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a"} Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.005851 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" event={"ID":"6896529a-3a6e-4ed1-95ef-69f76b042585","Type":"ContainerStarted","Data":"0882cf5f1678d5033e133195b71728370bde88feeefe135fe527f6e7563b9964"} Dec 03 
14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.005876 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fq5w" Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.007127 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.019015 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.027462 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" podStartSLOduration=7.027435024 podStartE2EDuration="7.027435024s" podCreationTimestamp="2025-12-03 14:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:09:57.024636953 +0000 UTC m=+209.773607189" watchObservedRunningTime="2025-12-03 14:09:57.027435024 +0000 UTC m=+209.776405260" Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.039321 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.045684 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fq5w"] Dec 03 14:09:57 crc kubenswrapper[5004]: I1203 14:09:57.620035 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" path="/var/lib/kubelet/pods/90e833e1-0728-440d-b3ce-ab5b89b963ef/volumes" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.477045 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.477448 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.525294 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.717267 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.717510 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.756132 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.867378 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.867439 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:01 crc kubenswrapper[5004]: I1203 14:10:01.920584 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:02 crc kubenswrapper[5004]: I1203 14:10:02.077305 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:02 crc kubenswrapper[5004]: I1203 14:10:02.078533 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:10:02 crc 
kubenswrapper[5004]: I1203 14:10:02.091928 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:10:02 crc kubenswrapper[5004]: I1203 14:10:02.206307 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:02 crc kubenswrapper[5004]: I1203 14:10:02.206382 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:02 crc kubenswrapper[5004]: I1203 14:10:02.251358 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:03 crc kubenswrapper[5004]: I1203 14:10:03.092039 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:04 crc kubenswrapper[5004]: I1203 14:10:04.053331 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:10:04 crc kubenswrapper[5004]: I1203 14:10:04.054344 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l6c7p" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="registry-server" containerID="cri-o://732530e240ad9bc753060ca6f5126195d4c1d7ed67442aa0f480b1682e563728" gracePeriod=2 Dec 03 14:10:04 crc kubenswrapper[5004]: I1203 14:10:04.251997 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:10:05 crc kubenswrapper[5004]: I1203 14:10:05.060137 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cbkcz" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="registry-server" 
containerID="cri-o://3657455d95b821f0a6a52c3f730ecfdc12bbc4b673c43b50fdff9ddec97683f8" gracePeriod=2 Dec 03 14:10:05 crc kubenswrapper[5004]: I1203 14:10:05.437725 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:10:05 crc kubenswrapper[5004]: I1203 14:10:05.482487 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.072196 5004 generic.go:334] "Generic (PLEG): container finished" podID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerID="3657455d95b821f0a6a52c3f730ecfdc12bbc4b673c43b50fdff9ddec97683f8" exitCode=0 Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.072319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerDied","Data":"3657455d95b821f0a6a52c3f730ecfdc12bbc4b673c43b50fdff9ddec97683f8"} Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.074979 5004 generic.go:334] "Generic (PLEG): container finished" podID="83f94685-023c-4305-b816-37f10184a670" containerID="732530e240ad9bc753060ca6f5126195d4c1d7ed67442aa0f480b1682e563728" exitCode=0 Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.075005 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerDied","Data":"732530e240ad9bc753060ca6f5126195d4c1d7ed67442aa0f480b1682e563728"} Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.556166 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.692258 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.696515 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities\") pod \"7beb6a23-0e72-4008-a9a4-f20d972a2500\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.696550 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t8h\" (UniqueName: \"kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h\") pod \"7beb6a23-0e72-4008-a9a4-f20d972a2500\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.696615 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content\") pod \"7beb6a23-0e72-4008-a9a4-f20d972a2500\" (UID: \"7beb6a23-0e72-4008-a9a4-f20d972a2500\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.697743 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities" (OuterVolumeSpecName: "utilities") pod "7beb6a23-0e72-4008-a9a4-f20d972a2500" (UID: "7beb6a23-0e72-4008-a9a4-f20d972a2500"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.702692 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h" (OuterVolumeSpecName: "kube-api-access-g9t8h") pod "7beb6a23-0e72-4008-a9a4-f20d972a2500" (UID: "7beb6a23-0e72-4008-a9a4-f20d972a2500"). InnerVolumeSpecName "kube-api-access-g9t8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.755963 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7beb6a23-0e72-4008-a9a4-f20d972a2500" (UID: "7beb6a23-0e72-4008-a9a4-f20d972a2500"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.797640 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs2m7\" (UniqueName: \"kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7\") pod \"83f94685-023c-4305-b816-37f10184a670\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.797785 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities\") pod \"83f94685-023c-4305-b816-37f10184a670\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.797877 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content\") pod \"83f94685-023c-4305-b816-37f10184a670\" (UID: \"83f94685-023c-4305-b816-37f10184a670\") " Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.798446 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.798560 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9t8h\" (UniqueName: 
\"kubernetes.io/projected/7beb6a23-0e72-4008-a9a4-f20d972a2500-kube-api-access-g9t8h\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.798576 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7beb6a23-0e72-4008-a9a4-f20d972a2500-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.798828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities" (OuterVolumeSpecName: "utilities") pod "83f94685-023c-4305-b816-37f10184a670" (UID: "83f94685-023c-4305-b816-37f10184a670"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.802436 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7" (OuterVolumeSpecName: "kube-api-access-bs2m7") pod "83f94685-023c-4305-b816-37f10184a670" (UID: "83f94685-023c-4305-b816-37f10184a670"). InnerVolumeSpecName "kube-api-access-bs2m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.856264 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83f94685-023c-4305-b816-37f10184a670" (UID: "83f94685-023c-4305-b816-37f10184a670"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.899351 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs2m7\" (UniqueName: \"kubernetes.io/projected/83f94685-023c-4305-b816-37f10184a670-kube-api-access-bs2m7\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.899385 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.899395 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f94685-023c-4305-b816-37f10184a670-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:07 crc kubenswrapper[5004]: I1203 14:10:07.902463 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerName="oauth-openshift" containerID="cri-o://f4f2b43c9ed58b197f8307b302b350011e53da3de9a54ab046a33ae53c012665" gracePeriod=15 Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.092127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6c7p" event={"ID":"83f94685-023c-4305-b816-37f10184a670","Type":"ContainerDied","Data":"aedea22d9c6051e0b25b4cd25b978091daa5dde23d8d8c16b876e3ade57fdb25"} Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.092169 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6c7p" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.092178 5004 scope.go:117] "RemoveContainer" containerID="732530e240ad9bc753060ca6f5126195d4c1d7ed67442aa0f480b1682e563728" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.095389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbkcz" event={"ID":"7beb6a23-0e72-4008-a9a4-f20d972a2500","Type":"ContainerDied","Data":"0f747f96326600c177ed5e6f25c62e86277734de0abf3737e5be3e757fcf6f7c"} Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.095413 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbkcz" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.108248 5004 generic.go:334] "Generic (PLEG): container finished" podID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerID="f4f2b43c9ed58b197f8307b302b350011e53da3de9a54ab046a33ae53c012665" exitCode=0 Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.108287 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" event={"ID":"f13001d1-8878-499b-87c3-7730c30b1a5c","Type":"ContainerDied","Data":"f4f2b43c9ed58b197f8307b302b350011e53da3de9a54ab046a33ae53c012665"} Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.110225 5004 scope.go:117] "RemoveContainer" containerID="8f36041cc560919b5c4a43555edf19e58e0aeaabb306517ae31c7e8ab4277053" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.116494 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.123121 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l6c7p"] Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.132597 5004 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.135568 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cbkcz"] Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.149396 5004 scope.go:117] "RemoveContainer" containerID="1199662b4056f1b02383c3a884ca97b390ed36d893fd6781ee9bca7297680e7a" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.163014 5004 scope.go:117] "RemoveContainer" containerID="3657455d95b821f0a6a52c3f730ecfdc12bbc4b673c43b50fdff9ddec97683f8" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.183594 5004 scope.go:117] "RemoveContainer" containerID="402235e762e4ca58c6c55f115607b41cc72c5c2274c86760dda77fa76fe0bbe3" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.197311 5004 scope.go:117] "RemoveContainer" containerID="4006ef83e501072f789aef7559733b64f10435c0e031e0f8fc0392fc805330a5" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.295342 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405319 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svkd4\" (UniqueName: \"kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405437 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405487 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405529 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 
14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405566 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405603 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405656 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405696 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405741 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405924 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.405966 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " 
Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.406036 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.406073 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session\") pod \"f13001d1-8878-499b-87c3-7730c30b1a5c\" (UID: \"f13001d1-8878-499b-87c3-7730c30b1a5c\") " Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.406370 5004 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.406760 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.407177 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.407636 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.407708 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.409637 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4" (OuterVolumeSpecName: "kube-api-access-svkd4") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "kube-api-access-svkd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.410135 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.410435 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.410695 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.411351 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.412051 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.412364 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.412526 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.413250 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f13001d1-8878-499b-87c3-7730c30b1a5c" (UID: "f13001d1-8878-499b-87c3-7730c30b1a5c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507749 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svkd4\" (UniqueName: \"kubernetes.io/projected/f13001d1-8878-499b-87c3-7730c30b1a5c-kube-api-access-svkd4\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507790 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507807 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507818 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507828 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507838 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507847 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507872 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507886 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507900 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507912 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507925 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.507936 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f13001d1-8878-499b-87c3-7730c30b1a5c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 
14:10:08.846543 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:10:08 crc kubenswrapper[5004]: I1203 14:10:08.846772 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-czsxc" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="registry-server" containerID="cri-o://25f45ca38031fb20e09355700370117aab25b04985df8f33f15ad51c3cdebe00" gracePeriod=2 Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.116102 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" event={"ID":"f13001d1-8878-499b-87c3-7730c30b1a5c","Type":"ContainerDied","Data":"1d41724f14524304ef28ba6cf646bc31b73b38d0aef9d5edc03d04ebb4602224"} Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.116154 5004 scope.go:117] "RemoveContainer" containerID="f4f2b43c9ed58b197f8307b302b350011e53da3de9a54ab046a33ae53c012665" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.116235 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vxwzk" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.127710 5004 generic.go:334] "Generic (PLEG): container finished" podID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerID="25f45ca38031fb20e09355700370117aab25b04985df8f33f15ad51c3cdebe00" exitCode=0 Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.127764 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerDied","Data":"25f45ca38031fb20e09355700370117aab25b04985df8f33f15ad51c3cdebe00"} Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.150900 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"] Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.154832 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vxwzk"] Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.311028 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.415988 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities\") pod \"0b49f439-9668-4e11-92b3-03bac9e07f39\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.416085 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv89c\" (UniqueName: \"kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c\") pod \"0b49f439-9668-4e11-92b3-03bac9e07f39\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.416753 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content\") pod \"0b49f439-9668-4e11-92b3-03bac9e07f39\" (UID: \"0b49f439-9668-4e11-92b3-03bac9e07f39\") " Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.416767 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities" (OuterVolumeSpecName: "utilities") pod "0b49f439-9668-4e11-92b3-03bac9e07f39" (UID: "0b49f439-9668-4e11-92b3-03bac9e07f39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.416991 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.419157 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c" (OuterVolumeSpecName: "kube-api-access-mv89c") pod "0b49f439-9668-4e11-92b3-03bac9e07f39" (UID: "0b49f439-9668-4e11-92b3-03bac9e07f39"). InnerVolumeSpecName "kube-api-access-mv89c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.517587 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv89c\" (UniqueName: \"kubernetes.io/projected/0b49f439-9668-4e11-92b3-03bac9e07f39-kube-api-access-mv89c\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.547940 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b49f439-9668-4e11-92b3-03bac9e07f39" (UID: "0b49f439-9668-4e11-92b3-03bac9e07f39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.618190 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b49f439-9668-4e11-92b3-03bac9e07f39-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.619280 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" path="/var/lib/kubelet/pods/7beb6a23-0e72-4008-a9a4-f20d972a2500/volumes" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.619996 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f94685-023c-4305-b816-37f10184a670" path="/var/lib/kubelet/pods/83f94685-023c-4305-b816-37f10184a670/volumes" Dec 03 14:10:09 crc kubenswrapper[5004]: I1203 14:10:09.621107 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" path="/var/lib/kubelet/pods/f13001d1-8878-499b-87c3-7730c30b1a5c/volumes" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.138087 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czsxc" event={"ID":"0b49f439-9668-4e11-92b3-03bac9e07f39","Type":"ContainerDied","Data":"0b3270785c4c8e0ce5f1ec315c49dd4b7dc0c758b1e9cd233eecb2ddbd92674c"} Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.138159 5004 scope.go:117] "RemoveContainer" containerID="25f45ca38031fb20e09355700370117aab25b04985df8f33f15ad51c3cdebe00" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.138354 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-czsxc" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.160881 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.162640 5004 scope.go:117] "RemoveContainer" containerID="b21782556c38acc5332c98fbbb1f742e24a747f2add05c227a2771584d348efd" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.169076 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-czsxc"] Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.186022 5004 scope.go:117] "RemoveContainer" containerID="a9ba38ee55295b8f280d85948fd18ba2733812f2a88996f859188cef0f065720" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.202550 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.202815 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" podUID="45de6168-4622-4916-84f1-c0d15f570daf" containerName="controller-manager" containerID="cri-o://da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413" gracePeriod=30 Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.231513 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.231835 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" podUID="6896529a-3a6e-4ed1-95ef-69f76b042585" containerName="route-controller-manager" containerID="cri-o://4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a" gracePeriod=30 Dec 03 14:10:10 crc 
kubenswrapper[5004]: I1203 14:10:10.683186 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.748443 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.832334 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dr78\" (UniqueName: \"kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78\") pod \"6896529a-3a6e-4ed1-95ef-69f76b042585\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.832415 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca\") pod \"6896529a-3a6e-4ed1-95ef-69f76b042585\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.832437 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config\") pod \"6896529a-3a6e-4ed1-95ef-69f76b042585\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.832481 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert\") pod \"6896529a-3a6e-4ed1-95ef-69f76b042585\" (UID: \"6896529a-3a6e-4ed1-95ef-69f76b042585\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.833479 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca" (OuterVolumeSpecName: "client-ca") pod "6896529a-3a6e-4ed1-95ef-69f76b042585" (UID: "6896529a-3a6e-4ed1-95ef-69f76b042585"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.833563 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config" (OuterVolumeSpecName: "config") pod "6896529a-3a6e-4ed1-95ef-69f76b042585" (UID: "6896529a-3a6e-4ed1-95ef-69f76b042585"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.837054 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6896529a-3a6e-4ed1-95ef-69f76b042585" (UID: "6896529a-3a6e-4ed1-95ef-69f76b042585"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.837146 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78" (OuterVolumeSpecName: "kube-api-access-8dr78") pod "6896529a-3a6e-4ed1-95ef-69f76b042585" (UID: "6896529a-3a6e-4ed1-95ef-69f76b042585"). InnerVolumeSpecName "kube-api-access-8dr78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933600 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config\") pod \"45de6168-4622-4916-84f1-c0d15f570daf\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles\") pod \"45de6168-4622-4916-84f1-c0d15f570daf\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933706 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca\") pod \"45de6168-4622-4916-84f1-c0d15f570daf\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933730 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrxw\" (UniqueName: \"kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw\") pod \"45de6168-4622-4916-84f1-c0d15f570daf\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933760 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert\") pod \"45de6168-4622-4916-84f1-c0d15f570daf\" (UID: \"45de6168-4622-4916-84f1-c0d15f570daf\") " Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933956 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dr78\" (UniqueName: 
\"kubernetes.io/projected/6896529a-3a6e-4ed1-95ef-69f76b042585-kube-api-access-8dr78\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933967 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933975 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6896529a-3a6e-4ed1-95ef-69f76b042585-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.933985 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6896529a-3a6e-4ed1-95ef-69f76b042585-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.934612 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca" (OuterVolumeSpecName: "client-ca") pod "45de6168-4622-4916-84f1-c0d15f570daf" (UID: "45de6168-4622-4916-84f1-c0d15f570daf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.934634 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config" (OuterVolumeSpecName: "config") pod "45de6168-4622-4916-84f1-c0d15f570daf" (UID: "45de6168-4622-4916-84f1-c0d15f570daf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.934698 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "45de6168-4622-4916-84f1-c0d15f570daf" (UID: "45de6168-4622-4916-84f1-c0d15f570daf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.937005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "45de6168-4622-4916-84f1-c0d15f570daf" (UID: "45de6168-4622-4916-84f1-c0d15f570daf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:10:10 crc kubenswrapper[5004]: I1203 14:10:10.937100 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw" (OuterVolumeSpecName: "kube-api-access-ldrxw") pod "45de6168-4622-4916-84f1-c0d15f570daf" (UID: "45de6168-4622-4916-84f1-c0d15f570daf"). InnerVolumeSpecName "kube-api-access-ldrxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.034836 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.034882 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldrxw\" (UniqueName: \"kubernetes.io/projected/45de6168-4622-4916-84f1-c0d15f570daf-kube-api-access-ldrxw\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.034895 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45de6168-4622-4916-84f1-c0d15f570daf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.034904 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.034912 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45de6168-4622-4916-84f1-c0d15f570daf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.150439 5004 generic.go:334] "Generic (PLEG): container finished" podID="6896529a-3a6e-4ed1-95ef-69f76b042585" containerID="4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a" exitCode=0 Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.150496 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.150524 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" event={"ID":"6896529a-3a6e-4ed1-95ef-69f76b042585","Type":"ContainerDied","Data":"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a"} Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.150567 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn" event={"ID":"6896529a-3a6e-4ed1-95ef-69f76b042585","Type":"ContainerDied","Data":"0882cf5f1678d5033e133195b71728370bde88feeefe135fe527f6e7563b9964"} Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.150586 5004 scope.go:117] "RemoveContainer" containerID="4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.155176 5004 generic.go:334] "Generic (PLEG): container finished" podID="45de6168-4622-4916-84f1-c0d15f570daf" containerID="da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413" exitCode=0 Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.155224 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" event={"ID":"45de6168-4622-4916-84f1-c0d15f570daf","Type":"ContainerDied","Data":"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413"} Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.155268 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" event={"ID":"45de6168-4622-4916-84f1-c0d15f570daf","Type":"ContainerDied","Data":"a86ad740c9e23ec67a067bf845ec33912ab18f24cfe453714c8817682d8c231f"} Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.155329 5004 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56ccbbf646-xjgct" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.168636 5004 scope.go:117] "RemoveContainer" containerID="4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a" Dec 03 14:10:11 crc kubenswrapper[5004]: E1203 14:10:11.171145 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a\": container with ID starting with 4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a not found: ID does not exist" containerID="4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.171194 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a"} err="failed to get container status \"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a\": rpc error: code = NotFound desc = could not find container \"4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a\": container with ID starting with 4c05ab8770577846dd8230475d973a4dcc4f32ec8a58fd39da5dab979b48fe3a not found: ID does not exist" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.171218 5004 scope.go:117] "RemoveContainer" containerID="da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.197658 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.201120 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f95f9d96b-ltfcn"] Dec 03 14:10:11 crc kubenswrapper[5004]: 
I1203 14:10:11.205888 5004 scope.go:117] "RemoveContainer" containerID="da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413" Dec 03 14:10:11 crc kubenswrapper[5004]: E1203 14:10:11.206948 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413\": container with ID starting with da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413 not found: ID does not exist" containerID="da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.206998 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413"} err="failed to get container status \"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413\": rpc error: code = NotFound desc = could not find container \"da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413\": container with ID starting with da638db49d29310b76f015ba929fb9aeac0f1bfff841efde57386529ec7e5413 not found: ID does not exist" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.213717 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.224523 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56ccbbf646-xjgct"] Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.618365 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" path="/var/lib/kubelet/pods/0b49f439-9668-4e11-92b3-03bac9e07f39/volumes" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.619103 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="45de6168-4622-4916-84f1-c0d15f570daf" path="/var/lib/kubelet/pods/45de6168-4622-4916-84f1-c0d15f570daf/volumes" Dec 03 14:10:11 crc kubenswrapper[5004]: I1203 14:10:11.619592 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6896529a-3a6e-4ed1-95ef-69f76b042585" path="/var/lib/kubelet/pods/6896529a-3a6e-4ed1-95ef-69f76b042585/volumes" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198279 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8665c5968f-l4682"] Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198634 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198652 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198669 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198675 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198684 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerName="oauth-openshift" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198692 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerName="oauth-openshift" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198705 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 
14:10:12.198713 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198724 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198731 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198740 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198747 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198758 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198764 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198776 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198782 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198793 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45de6168-4622-4916-84f1-c0d15f570daf" containerName="controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 
14:10:12.198800 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45de6168-4622-4916-84f1-c0d15f570daf" containerName="controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198809 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198821 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198830 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198838 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198848 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198855 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198883 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198891 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="extract-content" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198907 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 
14:10:12.198915 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="extract-utilities" Dec 03 14:10:12 crc kubenswrapper[5004]: E1203 14:10:12.198925 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6896529a-3a6e-4ed1-95ef-69f76b042585" containerName="route-controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.198933 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6896529a-3a6e-4ed1-95ef-69f76b042585" containerName="route-controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199060 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e833e1-0728-440d-b3ce-ab5b89b963ef" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199074 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f94685-023c-4305-b816-37f10184a670" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199083 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6896529a-3a6e-4ed1-95ef-69f76b042585" containerName="route-controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199095 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13001d1-8878-499b-87c3-7730c30b1a5c" containerName="oauth-openshift" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199105 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b49f439-9668-4e11-92b3-03bac9e07f39" containerName="registry-server" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199113 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="45de6168-4622-4916-84f1-c0d15f570daf" containerName="controller-manager" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199123 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7beb6a23-0e72-4008-a9a4-f20d972a2500" containerName="registry-server" Dec 
03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.199615 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.203991 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9"] Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.204592 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.205008 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.205426 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.205908 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.205966 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.206393 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.211733 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.218768 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.219532 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.219581 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.221725 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8665c5968f-l4682"] Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.223937 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.225055 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.228218 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.228968 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9"] Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.230973 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247399 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaea8b42-c5de-40b0-8d03-105a169a05de-serving-cert\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247457 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-client-ca\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247502 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff787\" (UniqueName: \"kubernetes.io/projected/aaea8b42-c5de-40b0-8d03-105a169a05de-kube-api-access-ff787\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247522 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-client-ca\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247547 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-config\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247563 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a1db8a-5e30-410f-aa93-73092265eeeb-serving-cert\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247583 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lj89\" (UniqueName: \"kubernetes.io/projected/38a1db8a-5e30-410f-aa93-73092265eeeb-kube-api-access-6lj89\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247615 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-config\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.247633 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-proxy-ca-bundles\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.348444 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-proxy-ca-bundles\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.348882 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaea8b42-c5de-40b0-8d03-105a169a05de-serving-cert\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.348986 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-client-ca\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff787\" (UniqueName: \"kubernetes.io/projected/aaea8b42-c5de-40b0-8d03-105a169a05de-kube-api-access-ff787\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349177 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-client-ca\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " 
pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349216 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-config\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349259 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a1db8a-5e30-410f-aa93-73092265eeeb-serving-cert\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lj89\" (UniqueName: \"kubernetes.io/projected/38a1db8a-5e30-410f-aa93-73092265eeeb-kube-api-access-6lj89\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.349354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-config\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.350192 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-client-ca\") pod 
\"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.350387 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-client-ca\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.350568 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-proxy-ca-bundles\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.351145 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaea8b42-c5de-40b0-8d03-105a169a05de-config\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.352734 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaea8b42-c5de-40b0-8d03-105a169a05de-serving-cert\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.353513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/38a1db8a-5e30-410f-aa93-73092265eeeb-serving-cert\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.354413 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a1db8a-5e30-410f-aa93-73092265eeeb-config\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.372106 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lj89\" (UniqueName: \"kubernetes.io/projected/38a1db8a-5e30-410f-aa93-73092265eeeb-kube-api-access-6lj89\") pod \"controller-manager-8665c5968f-l4682\" (UID: \"38a1db8a-5e30-410f-aa93-73092265eeeb\") " pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.380489 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff787\" (UniqueName: \"kubernetes.io/projected/aaea8b42-c5de-40b0-8d03-105a169a05de-kube-api-access-ff787\") pod \"route-controller-manager-5c7b8457d-rhfd9\" (UID: \"aaea8b42-c5de-40b0-8d03-105a169a05de\") " pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.543639 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.556568 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.780661 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8665c5968f-l4682"] Dec 03 14:10:12 crc kubenswrapper[5004]: W1203 14:10:12.786446 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a1db8a_5e30_410f_aa93_73092265eeeb.slice/crio-02ace8a71405b34f987b03bfdb30f9a60730ed5792f959bc801f9b2e2a26c35a WatchSource:0}: Error finding container 02ace8a71405b34f987b03bfdb30f9a60730ed5792f959bc801f9b2e2a26c35a: Status 404 returned error can't find the container with id 02ace8a71405b34f987b03bfdb30f9a60730ed5792f959bc801f9b2e2a26c35a Dec 03 14:10:12 crc kubenswrapper[5004]: I1203 14:10:12.836948 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9"] Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.174000 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" event={"ID":"38a1db8a-5e30-410f-aa93-73092265eeeb","Type":"ContainerStarted","Data":"486951f1df7b7857df4c12e617b9435df8324942195bbd1ab71c73e3a62c7c7e"} Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.174072 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" event={"ID":"38a1db8a-5e30-410f-aa93-73092265eeeb","Type":"ContainerStarted","Data":"02ace8a71405b34f987b03bfdb30f9a60730ed5792f959bc801f9b2e2a26c35a"} Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.174524 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.176125 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" event={"ID":"aaea8b42-c5de-40b0-8d03-105a169a05de","Type":"ContainerStarted","Data":"cd2c90de8b857ff7434a995013ce27daf3775ddcef90b9e17e88945eea8922bf"} Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.176154 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" event={"ID":"aaea8b42-c5de-40b0-8d03-105a169a05de","Type":"ContainerStarted","Data":"73b0b66adb4add3faae31905ba9572aedd8d3bbe1128201bc8f43d9dc3a32e41"} Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.176362 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.179294 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.206224 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8665c5968f-l4682" podStartSLOduration=3.206204542 podStartE2EDuration="3.206204542s" podCreationTimestamp="2025-12-03 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:10:13.202072223 +0000 UTC m=+225.951042459" watchObservedRunningTime="2025-12-03 14:10:13.206204542 +0000 UTC m=+225.955174788" Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.219260 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" podStartSLOduration=3.219227628 podStartE2EDuration="3.219227628s" podCreationTimestamp="2025-12-03 14:10:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:10:13.217701143 +0000 UTC m=+225.966671389" watchObservedRunningTime="2025-12-03 14:10:13.219227628 +0000 UTC m=+225.968197864" Dec 03 14:10:13 crc kubenswrapper[5004]: I1203 14:10:13.458294 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c7b8457d-rhfd9" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419120 5004 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419685 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c" gracePeriod=15 Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419746 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303" gracePeriod=15 Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419845 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2" gracePeriod=15 Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419852 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111" gracePeriod=15 Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.419886 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef" gracePeriod=15 Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.420674 5004 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.420923 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.420938 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.420951 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.420958 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.420970 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.420979 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 14:10:15 crc 
kubenswrapper[5004]: E1203 14:10:15.420993 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421000 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.421015 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421021 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.421031 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421038 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421134 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421145 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421155 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421165 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421175 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.421281 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421290 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.421400 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.423141 5004 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.424065 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.427338 5004 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.455865 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590412 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590457 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590490 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590529 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590547 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590566 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590624 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.590642 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692217 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692272 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692476 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692527 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692556 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692611 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692630 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692666 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692748 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.692952 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.693039 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.693085 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.693117 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.693089 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.693182 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: I1203 14:10:15.752170 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:10:15 crc kubenswrapper[5004]: W1203 14:10:15.774249 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-937347340e5b81adef99c21341c6cd547c993aa31840a5f987ef5d2f357c4c67 WatchSource:0}: Error finding container 937347340e5b81adef99c21341c6cd547c993aa31840a5f987ef5d2f357c4c67: Status 404 returned error can't find the container with id 937347340e5b81adef99c21341c6cd547c993aa31840a5f987ef5d2f357c4c67 Dec 03 14:10:15 crc kubenswrapper[5004]: E1203 14:10:15.777080 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db9e38142f71d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,LastTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.195671 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.197479 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.198752 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303" exitCode=0 Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.198796 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111" exitCode=0 Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.198806 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2" exitCode=0 Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.198817 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef" exitCode=2 Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.198976 5004 scope.go:117] "RemoveContainer" containerID="3b153fc3dfdc655ee95c19863d28184c3e08e0952473158d5c9209404455c3c2" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.202083 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20"} Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.202163 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"937347340e5b81adef99c21341c6cd547c993aa31840a5f987ef5d2f357c4c67"} Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.202965 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.205020 5004 generic.go:334] "Generic (PLEG): container finished" podID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" containerID="c168f273699c41bfb08b77bca0ee0c49319fb136c03026f49e0da55ce79494c4" exitCode=0 Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.205093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5cbc28ec-8596-4137-97d8-7c0d3f19043c","Type":"ContainerDied","Data":"c168f273699c41bfb08b77bca0ee0c49319fb136c03026f49e0da55ce79494c4"} Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.205975 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:16 crc kubenswrapper[5004]: I1203 14:10:16.206485 5004 status_manager.go:851] "Failed to get status 
for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:16 crc kubenswrapper[5004]: E1203 14:10:16.703742 5004 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" volumeName="registry-storage" Dec 03 14:10:16 crc kubenswrapper[5004]: E1203 14:10:16.918400 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db9e38142f71d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,LastTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.213686 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.545703 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.546571 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.547323 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.616783 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.617200 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access\") pod \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718489 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir\") pod \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718520 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock\") pod \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\" (UID: \"5cbc28ec-8596-4137-97d8-7c0d3f19043c\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718664 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5cbc28ec-8596-4137-97d8-7c0d3f19043c" (UID: "5cbc28ec-8596-4137-97d8-7c0d3f19043c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718758 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock" (OuterVolumeSpecName: "var-lock") pod "5cbc28ec-8596-4137-97d8-7c0d3f19043c" (UID: "5cbc28ec-8596-4137-97d8-7c0d3f19043c"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.718977 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.719003 5004 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5cbc28ec-8596-4137-97d8-7c0d3f19043c-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.723337 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5cbc28ec-8596-4137-97d8-7c0d3f19043c" (UID: "5cbc28ec-8596-4137-97d8-7c0d3f19043c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.810228 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.811345 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.812151 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.812671 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.813277 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.820125 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cbc28ec-8596-4137-97d8-7c0d3f19043c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921323 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921396 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921446 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921484 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921505 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.921531 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.922162 5004 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.922207 5004 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:17 crc kubenswrapper[5004]: I1203 14:10:17.922233 5004 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.224683 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5cbc28ec-8596-4137-97d8-7c0d3f19043c","Type":"ContainerDied","Data":"2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f"} Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.224721 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.224746 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fed5cf6c8a261e5286386a0fad09d21429f5e6ee163a0ad48e9993f1b8d910f" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.229371 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.230627 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c" exitCode=0 Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.230734 5004 scope.go:117] "RemoveContainer" containerID="80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.230788 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.243621 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.244105 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.244324 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.251823 5004 scope.go:117] "RemoveContainer" containerID="f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.269483 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.269740 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.269982 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.270452 5004 scope.go:117] "RemoveContainer" containerID="a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.285738 5004 scope.go:117] "RemoveContainer" containerID="5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.302148 5004 scope.go:117] "RemoveContainer" containerID="5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.320965 5004 scope.go:117] "RemoveContainer" containerID="534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.338880 5004 scope.go:117] "RemoveContainer" containerID="80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.340291 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\": container with ID starting with 80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303 not found: ID does not exist" containerID="80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303" 
Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.340335 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303"} err="failed to get container status \"80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\": rpc error: code = NotFound desc = could not find container \"80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303\": container with ID starting with 80185571d03f2c1716636b32c19a80238c872938672b5cbe355b2a9220927303 not found: ID does not exist" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.340361 5004 scope.go:117] "RemoveContainer" containerID="f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.340602 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\": container with ID starting with f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111 not found: ID does not exist" containerID="f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.340647 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111"} err="failed to get container status \"f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\": rpc error: code = NotFound desc = could not find container \"f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111\": container with ID starting with f6cd56a9a36203c843bf7ee8dbb0c29b1259d3b770cb8ebd6c4071bb224e4111 not found: ID does not exist" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.340675 5004 scope.go:117] "RemoveContainer" 
containerID="a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.341026 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\": container with ID starting with a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2 not found: ID does not exist" containerID="a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341065 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2"} err="failed to get container status \"a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\": rpc error: code = NotFound desc = could not find container \"a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2\": container with ID starting with a0e21d472d7cbc03ecd25d2daa5381ad40d0757277556d7af922bcc869cd0ac2 not found: ID does not exist" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341091 5004 scope.go:117] "RemoveContainer" containerID="5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.341318 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\": container with ID starting with 5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef not found: ID does not exist" containerID="5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341344 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef"} err="failed to get container status \"5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\": rpc error: code = NotFound desc = could not find container \"5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef\": container with ID starting with 5bcd7bb25025f3cf8b77bc841219901e7d4f0d55a8bd8aba409896c65e72f0ef not found: ID does not exist" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341358 5004 scope.go:117] "RemoveContainer" containerID="5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.341664 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\": container with ID starting with 5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c not found: ID does not exist" containerID="5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341684 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c"} err="failed to get container status \"5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\": rpc error: code = NotFound desc = could not find container \"5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c\": container with ID starting with 5603b959482f14f962dfd2f270ad3d2338c78399ad4fcab557da46026853c28c not found: ID does not exist" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.341696 5004 scope.go:117] "RemoveContainer" containerID="534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5" Dec 03 14:10:18 crc kubenswrapper[5004]: E1203 14:10:18.342403 5004 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\": container with ID starting with 534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5 not found: ID does not exist" containerID="534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5" Dec 03 14:10:18 crc kubenswrapper[5004]: I1203 14:10:18.342442 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5"} err="failed to get container status \"534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\": rpc error: code = NotFound desc = could not find container \"534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5\": container with ID starting with 534442443ddf912d5a9902962c195df8acff24073002e95d131a9b8436c66af5 not found: ID does not exist" Dec 03 14:10:19 crc kubenswrapper[5004]: I1203 14:10:19.621091 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.041198 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.043156 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.043571 5004 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.044335 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.044710 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:23 crc kubenswrapper[5004]: I1203 14:10:23.044741 5004 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.045069 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.246684 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 03 14:10:23 crc kubenswrapper[5004]: E1203 14:10:23.647688 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 03 14:10:24 crc kubenswrapper[5004]: E1203 
14:10:24.449429 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 03 14:10:26 crc kubenswrapper[5004]: E1203 14:10:26.051066 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s" Dec 03 14:10:26 crc kubenswrapper[5004]: E1203 14:10:26.919514 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187db9e38142f71d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,LastTimestamp:2025-12-03 14:10:15.776466717 +0000 UTC m=+228.525436953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 14:10:27 crc kubenswrapper[5004]: I1203 14:10:27.618024 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:27 crc kubenswrapper[5004]: I1203 14:10:27.619854 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.297025 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.297292 5004 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e" exitCode=1 Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.297319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e"} Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.297732 5004 scope.go:117] "RemoveContainer" containerID="50ecbf6f2dcd8fc85d5e49e08fb94e3b7ded741fe4fffb6c73bc519a0531302e" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.298278 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.299049 5004 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.300261 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.565792 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.612475 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.613407 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.613869 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.614402 5004 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.627296 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.627368 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:28 crc kubenswrapper[5004]: E1203 14:10:28.627841 5004 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:28 crc kubenswrapper[5004]: I1203 14:10:28.628521 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:28 crc kubenswrapper[5004]: W1203 14:10:28.654088 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f14a84d70011b0113e2bc520321a59b0948157337683c338968dfb2bbc9798c2 WatchSource:0}: Error finding container f14a84d70011b0113e2bc520321a59b0948157337683c338968dfb2bbc9798c2: Status 404 returned error can't find the container with id f14a84d70011b0113e2bc520321a59b0948157337683c338968dfb2bbc9798c2 Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.306386 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.306803 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a73a4bf226af407c89a5c7224f13e88a482f00147d1cca80d38702992823f363"} Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.307912 5004 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.308092 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.308262 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.310569 5004 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f8292be0b1c373a5f4a4c5070eda19d03c0657724c4affb1e6c8080b64294060" exitCode=0 Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.310600 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f8292be0b1c373a5f4a4c5070eda19d03c0657724c4affb1e6c8080b64294060"} Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.310616 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f14a84d70011b0113e2bc520321a59b0948157337683c338968dfb2bbc9798c2"} Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.310812 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.310825 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:29 crc kubenswrapper[5004]: E1203 14:10:29.311051 5004 
mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.311489 5004 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.311935 5004 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:29 crc kubenswrapper[5004]: E1203 14:10:29.312062 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="6.4s" Dec 03 14:10:29 crc kubenswrapper[5004]: I1203 14:10:29.312276 5004 status_manager.go:851] "Failed to get status for pod" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 03 14:10:30 crc kubenswrapper[5004]: I1203 14:10:30.319093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e96171e64a68bfc86fbbc6f03e86598e677fa25fe18a5f2d41ef560e34b92bdf"} Dec 03 14:10:30 crc kubenswrapper[5004]: I1203 14:10:30.319521 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"47e303b8195398d486531bb5728cb8ca5f526c1b361f4d365bb1a959ee539d75"} Dec 03 14:10:30 crc kubenswrapper[5004]: I1203 14:10:30.319561 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22c9e91c74fcf280ff807b79d003b14ccc05afc08dec99afaee99019e4bc78e0"} Dec 03 14:10:30 crc kubenswrapper[5004]: I1203 14:10:30.319588 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c8ab1aab53460f9fa7a775b46d9845a0d5b4a05c8f1eeed01287189b0b128b8"} Dec 03 14:10:31 crc kubenswrapper[5004]: I1203 14:10:31.328634 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42fd6784f72e791b40938052bd3f9cc1cf9673a98905635fcf8ec3b9ff24fbff"} Dec 03 14:10:31 crc kubenswrapper[5004]: I1203 14:10:31.328817 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:31 crc kubenswrapper[5004]: I1203 14:10:31.328936 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:31 crc kubenswrapper[5004]: I1203 14:10:31.328963 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:33 crc kubenswrapper[5004]: I1203 14:10:33.136484 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:10:33 crc kubenswrapper[5004]: I1203 14:10:33.628998 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:33 crc kubenswrapper[5004]: I1203 14:10:33.629068 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:33 crc kubenswrapper[5004]: I1203 14:10:33.635236 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:36 crc kubenswrapper[5004]: I1203 14:10:36.337007 5004 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:36 crc kubenswrapper[5004]: I1203 14:10:36.353641 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:36 crc kubenswrapper[5004]: I1203 14:10:36.353884 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:36 crc kubenswrapper[5004]: I1203 14:10:36.373922 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:37 crc kubenswrapper[5004]: I1203 14:10:37.361349 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:37 crc kubenswrapper[5004]: I1203 14:10:37.362788 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" 
Dec 03 14:10:37 crc kubenswrapper[5004]: I1203 14:10:37.629251 5004 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="10b73293-9e0b-4235-9efb-983af750e5ff" Dec 03 14:10:38 crc kubenswrapper[5004]: I1203 14:10:38.565829 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:10:38 crc kubenswrapper[5004]: I1203 14:10:38.565954 5004 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 14:10:38 crc kubenswrapper[5004]: I1203 14:10:38.566012 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 14:10:45 crc kubenswrapper[5004]: I1203 14:10:45.651476 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 14:10:45 crc kubenswrapper[5004]: I1203 14:10:45.976263 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.216174 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.373842 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 
14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.431847 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.476226 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.688662 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.812596 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.821359 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 14:10:46 crc kubenswrapper[5004]: I1203 14:10:46.986728 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.248181 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.266580 5004 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.310689 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.442759 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.654337 5004 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.762375 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 14:10:47 crc kubenswrapper[5004]: I1203 14:10:47.853137 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.051013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.095496 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.343393 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.360801 5004 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.566734 5004 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.566830 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection 
refused" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.581949 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.693334 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.719744 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.736818 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.939127 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.946480 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 14:10:48 crc kubenswrapper[5004]: I1203 14:10:48.952097 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.058208 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.124741 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.206565 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.379830 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.700137 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.712824 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.742877 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.779771 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.797729 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 14:10:49 crc kubenswrapper[5004]: I1203 14:10:49.995329 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.004071 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.059039 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.063409 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.065553 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.124430 5004 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.137927 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.385828 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.479510 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.499677 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.744079 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.782698 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.811923 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.837117 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.837387 5004 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.837745 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.837729071 
podStartE2EDuration="35.837729071s" podCreationTimestamp="2025-12-03 14:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:10:36.348728387 +0000 UTC m=+249.097698623" watchObservedRunningTime="2025-12-03 14:10:50.837729071 +0000 UTC m=+263.586699317" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843252 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843313 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7544d6d989-2kr6w","openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 14:10:50 crc kubenswrapper[5004]: E1203 14:10:50.843516 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" containerName="installer" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843537 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" containerName="installer" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843680 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cbc28ec-8596-4137-97d8-7c0d3f19043c" containerName="installer" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843832 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.843895 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5464e9f5-0e39-426f-b9da-d5959948f8fe" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.844124 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847334 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847472 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847670 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847718 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847767 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.847943 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.848822 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.849958 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.849985 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.850069 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 14:10:50 crc 
kubenswrapper[5004]: I1203 14:10:50.850440 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.850441 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.851237 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.863751 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.864963 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.870170 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.885047 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.895913 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.89584966 podStartE2EDuration="14.89584966s" podCreationTimestamp="2025-12-03 14:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:10:50.889941285 +0000 UTC m=+263.638911551" watchObservedRunningTime="2025-12-03 14:10:50.89584966 +0000 UTC m=+263.644819936" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940008 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-policies\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940085 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940312 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940368 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940393 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940428 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940466 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4xl\" (UniqueName: \"kubernetes.io/projected/0f95eaec-df2e-48c0-b2bb-f5f348129df0-kube-api-access-mm4xl\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940502 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: 
\"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940543 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-session\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940584 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-error\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940684 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940754 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-dir\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:50 crc kubenswrapper[5004]: I1203 14:10:50.940801 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-login\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.019288 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-session\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-error\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042290 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-dir\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-login\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042331 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-policies\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.042421 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-dir\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043083 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-audit-policies\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 
14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043193 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043264 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043317 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.043353 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.044213 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.044603 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.044817 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.044933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " 
pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.044977 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4xl\" (UniqueName: \"kubernetes.io/projected/0f95eaec-df2e-48c0-b2bb-f5f348129df0-kube-api-access-mm4xl\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.045019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.048238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-session\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.048238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.048802 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-error\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.048829 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-login\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.049283 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.049918 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.050060 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " 
pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.052674 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f95eaec-df2e-48c0-b2bb-f5f348129df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.062616 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4xl\" (UniqueName: \"kubernetes.io/projected/0f95eaec-df2e-48c0-b2bb-f5f348129df0-kube-api-access-mm4xl\") pod \"oauth-openshift-7544d6d989-2kr6w\" (UID: \"0f95eaec-df2e-48c0-b2bb-f5f348129df0\") " pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.127731 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.144515 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.163632 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.272106 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.275755 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.314886 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.353899 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.383190 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.391805 5004 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.519135 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.533489 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.590606 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.707158 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.715554 5004 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.874411 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.893992 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.978966 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 14:10:51 crc kubenswrapper[5004]: I1203 14:10:51.982341 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.028156 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.063384 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.182399 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.187638 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.260489 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.273829 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.287687 5004 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.360617 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.371829 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.402247 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.417027 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.648300 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.738801 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 14:10:52 crc kubenswrapper[5004]: I1203 14:10:52.919609 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.017663 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.037012 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.181504 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.190495 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.302346 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.309176 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.367993 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.449288 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.621361 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.724797 5004 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.806008 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.813135 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.867789 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 14:10:53 crc kubenswrapper[5004]: I1203 14:10:53.882545 
5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.011851 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.090806 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.123643 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.136354 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.173807 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.176154 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.180250 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.209092 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7544d6d989-2kr6w"] Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.215735 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.309698 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 
14:10:54.357006 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.369402 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.397182 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.408293 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.422416 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.463705 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.531439 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.537469 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.627017 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.671795 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.682618 5004 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.907237 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:10:54 crc kubenswrapper[5004]: E1203 14:10:54.955330 5004 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 14:10:54 crc kubenswrapper[5004]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7544d6d989-2kr6w_openshift-authentication_0f95eaec-df2e-48c0-b2bb-f5f348129df0_0(0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f): error adding pod openshift-authentication_oauth-openshift-7544d6d989-2kr6w to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f" Netns:"/var/run/netns/e5e71ddd-2e98-4178-97d9-ef0320f304d5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7544d6d989-2kr6w;K8S_POD_INFRA_CONTAINER_ID=0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f;K8S_POD_UID=0f95eaec-df2e-48c0-b2bb-f5f348129df0" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7544d6d989-2kr6w] networking: Multus: [openshift-authentication/oauth-openshift-7544d6d989-2kr6w/0f95eaec-df2e-48c0-b2bb-f5f348129df0]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7544d6d989-2kr6w in out of cluster comm: pod "oauth-openshift-7544d6d989-2kr6w" not found Dec 03 14:10:54 crc kubenswrapper[5004]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 14:10:54 crc kubenswrapper[5004]: > Dec 03 14:10:54 crc kubenswrapper[5004]: E1203 14:10:54.955769 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 14:10:54 crc kubenswrapper[5004]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7544d6d989-2kr6w_openshift-authentication_0f95eaec-df2e-48c0-b2bb-f5f348129df0_0(0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f): error adding pod openshift-authentication_oauth-openshift-7544d6d989-2kr6w to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f" Netns:"/var/run/netns/e5e71ddd-2e98-4178-97d9-ef0320f304d5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7544d6d989-2kr6w;K8S_POD_INFRA_CONTAINER_ID=0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f;K8S_POD_UID=0f95eaec-df2e-48c0-b2bb-f5f348129df0" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7544d6d989-2kr6w] networking: Multus: [openshift-authentication/oauth-openshift-7544d6d989-2kr6w/0f95eaec-df2e-48c0-b2bb-f5f348129df0]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7544d6d989-2kr6w in out of cluster comm: pod "oauth-openshift-7544d6d989-2kr6w" not found Dec 03 14:10:54 crc kubenswrapper[5004]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 14:10:54 crc kubenswrapper[5004]: > pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:54 crc kubenswrapper[5004]: E1203 14:10:54.955796 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 14:10:54 crc kubenswrapper[5004]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7544d6d989-2kr6w_openshift-authentication_0f95eaec-df2e-48c0-b2bb-f5f348129df0_0(0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f): error adding pod openshift-authentication_oauth-openshift-7544d6d989-2kr6w to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f" Netns:"/var/run/netns/e5e71ddd-2e98-4178-97d9-ef0320f304d5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7544d6d989-2kr6w;K8S_POD_INFRA_CONTAINER_ID=0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f;K8S_POD_UID=0f95eaec-df2e-48c0-b2bb-f5f348129df0" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7544d6d989-2kr6w] networking: Multus: [openshift-authentication/oauth-openshift-7544d6d989-2kr6w/0f95eaec-df2e-48c0-b2bb-f5f348129df0]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7544d6d989-2kr6w in out of cluster comm: pod "oauth-openshift-7544d6d989-2kr6w" not found Dec 03 14:10:54 crc 
kubenswrapper[5004]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 14:10:54 crc kubenswrapper[5004]: > pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:54 crc kubenswrapper[5004]: E1203 14:10:54.955877 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7544d6d989-2kr6w_openshift-authentication(0f95eaec-df2e-48c0-b2bb-f5f348129df0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7544d6d989-2kr6w_openshift-authentication(0f95eaec-df2e-48c0-b2bb-f5f348129df0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7544d6d989-2kr6w_openshift-authentication_0f95eaec-df2e-48c0-b2bb-f5f348129df0_0(0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f): error adding pod openshift-authentication_oauth-openshift-7544d6d989-2kr6w to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f\\\" Netns:\\\"/var/run/netns/e5e71ddd-2e98-4178-97d9-ef0320f304d5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7544d6d989-2kr6w;K8S_POD_INFRA_CONTAINER_ID=0fb6c1a7b5c773856c52daac8c9703f9e2c46e67fe2a512eb8c794ac56acfe5f;K8S_POD_UID=0f95eaec-df2e-48c0-b2bb-f5f348129df0\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7544d6d989-2kr6w] networking: Multus: 
[openshift-authentication/oauth-openshift-7544d6d989-2kr6w/0f95eaec-df2e-48c0-b2bb-f5f348129df0]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7544d6d989-2kr6w in out of cluster comm: pod \\\"oauth-openshift-7544d6d989-2kr6w\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" podUID="0f95eaec-df2e-48c0-b2bb-f5f348129df0" Dec 03 14:10:54 crc kubenswrapper[5004]: I1203 14:10:54.979946 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.021424 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.060222 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.068046 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.082169 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.345512 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.467947 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.473510 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.474191 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.516074 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.549991 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.553822 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.577251 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.582556 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.631889 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.650129 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.761708 5004 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.786037 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.786082 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.804853 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.829448 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7544d6d989-2kr6w"] Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.877570 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 14:10:55 crc kubenswrapper[5004]: I1203 14:10:55.895643 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.002290 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.043805 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.080710 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.098017 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.204498 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.279066 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.300473 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.304311 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.480403 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" event={"ID":"0f95eaec-df2e-48c0-b2bb-f5f348129df0","Type":"ContainerStarted","Data":"42a0f033e4b4115f0b5fe89b81e53648158d7a7285113de92385e5ed970d20fc"} Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.480452 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" event={"ID":"0f95eaec-df2e-48c0-b2bb-f5f348129df0","Type":"ContainerStarted","Data":"2424b3e78a3f8205829610ac7fa3011bc4a9b3534dc596dc680b101bd0a43baa"} Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.482031 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.511550 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" podStartSLOduration=74.511529451 podStartE2EDuration="1m14.511529451s" podCreationTimestamp="2025-12-03 14:09:42 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:10:56.506268603 +0000 UTC m=+269.255238859" watchObservedRunningTime="2025-12-03 14:10:56.511529451 +0000 UTC m=+269.260499707" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.551348 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.644182 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.706133 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7544d6d989-2kr6w" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.806788 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.861189 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 14:10:56 crc kubenswrapper[5004]: I1203 14:10:56.987542 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.060993 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.069778 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.077186 5004 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.096901 5004 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.136815 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.175811 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.310291 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.385738 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.633575 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.638311 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.725619 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.759146 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.766389 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 14:10:57 crc kubenswrapper[5004]: I1203 14:10:57.801655 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.009897 5004 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.137088 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.139302 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.201046 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.206939 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.224582 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.263109 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.277403 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.328271 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.362664 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.409794 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 14:10:58 
crc kubenswrapper[5004]: I1203 14:10:58.435375 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.489804 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.500794 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.573729 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.579734 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.600521 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.626829 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.667626 5004 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.667882 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20" gracePeriod=5 Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.725534 5004 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.788613 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.801813 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.832082 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.871324 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 14:10:58 crc kubenswrapper[5004]: I1203 14:10:58.909780 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.005987 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.184772 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.419241 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.428890 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.489660 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 14:10:59 
crc kubenswrapper[5004]: I1203 14:10:59.494176 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.582429 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.584965 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.598108 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.614535 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.686058 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.827352 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.828576 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.843548 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.904013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 14:10:59 crc kubenswrapper[5004]: I1203 14:10:59.922120 5004 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.026735 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.067844 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.122742 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.218938 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.288974 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.312190 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.364709 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.400216 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.412621 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.565943 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.708457 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.736562 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.827561 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.835047 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.891365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 14:11:00 crc kubenswrapper[5004]: I1203 14:11:00.898265 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.107555 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.199475 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.258717 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.279280 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 
14:11:01.382207 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.401782 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.684373 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.704621 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.736342 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.898045 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 14:11:01 crc kubenswrapper[5004]: I1203 14:11:01.975561 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 14:11:02 crc kubenswrapper[5004]: I1203 14:11:02.532881 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 14:11:02 crc kubenswrapper[5004]: I1203 14:11:02.568991 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 14:11:02 crc kubenswrapper[5004]: I1203 14:11:02.823960 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.310167 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.310246 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.382204 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410043 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410159 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410261 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410314 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410347 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410403 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410462 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410471 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410565 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410796 5004 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410809 5004 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410817 5004 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.410825 5004 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.419719 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.516937 5004 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.522106 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.527941 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.528001 5004 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20" exitCode=137 Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.528059 5004 scope.go:117] "RemoveContainer" containerID="3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.528104 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.550146 5004 scope.go:117] "RemoveContainer" containerID="3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20" Dec 03 14:11:04 crc kubenswrapper[5004]: E1203 14:11:04.550600 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20\": container with ID starting with 3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20 not found: ID does not exist" containerID="3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20" Dec 03 14:11:04 crc kubenswrapper[5004]: I1203 14:11:04.550653 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20"} err="failed to get container status \"3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20\": rpc error: code = NotFound desc = could not find container \"3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20\": container with ID starting with 3f3c01a10c74f5c86f1f1fce60d3632d92b446b4c1c378f58d54b2bad6f39d20 not found: ID does not exist" Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 14:11:05.619449 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 14:11:05.619762 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 14:11:05.630416 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 
14:11:05.630467 5004 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e2649e97-b77b-4528-a381-56084b8e6ba4" Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 14:11:05.634639 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 14:11:05 crc kubenswrapper[5004]: I1203 14:11:05.634665 5004 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e2649e97-b77b-4528-a381-56084b8e6ba4" Dec 03 14:11:22 crc kubenswrapper[5004]: I1203 14:11:22.628990 5004 generic.go:334] "Generic (PLEG): container finished" podID="98002580-e0a7-49b9-9258-222fd6901e29" containerID="a17a3fbeefeb9762061857ac31ff8b7e7fa4e5a5fcbd3f222153ade5f780df51" exitCode=0 Dec 03 14:11:22 crc kubenswrapper[5004]: I1203 14:11:22.629094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerDied","Data":"a17a3fbeefeb9762061857ac31ff8b7e7fa4e5a5fcbd3f222153ade5f780df51"} Dec 03 14:11:22 crc kubenswrapper[5004]: I1203 14:11:22.630450 5004 scope.go:117] "RemoveContainer" containerID="a17a3fbeefeb9762061857ac31ff8b7e7fa4e5a5fcbd3f222153ade5f780df51" Dec 03 14:11:23 crc kubenswrapper[5004]: I1203 14:11:23.635513 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerStarted","Data":"7e3124406a1abeb65441c5f84b0fcc7d478e4f016dff6720317a3a4db5c35091"} Dec 03 14:11:23 crc kubenswrapper[5004]: I1203 14:11:23.636235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:11:23 crc kubenswrapper[5004]: I1203 
14:11:23.638477 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:11:27 crc kubenswrapper[5004]: I1203 14:11:27.483978 5004 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.501809 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h5xj"] Dec 03 14:12:09 crc kubenswrapper[5004]: E1203 14:12:09.502539 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.502551 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.502651 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.503072 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.551742 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h5xj"] Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.626848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-registry-certificates\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.626961 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0200560-1870-4841-b74a-9eec2f50efef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627016 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627107 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-bound-sa-token\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627170 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzqt\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-kube-api-access-6lzqt\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627205 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d0200560-1870-4841-b74a-9eec2f50efef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627246 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-registry-tls\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.627307 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-trusted-ca\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.661444 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.728708 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-bound-sa-token\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.728895 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzqt\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-kube-api-access-6lzqt\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.728960 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d0200560-1870-4841-b74a-9eec2f50efef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.729008 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-registry-tls\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.729094 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-trusted-ca\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.729159 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-registry-certificates\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.729269 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0200560-1870-4841-b74a-9eec2f50efef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.729506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d0200560-1870-4841-b74a-9eec2f50efef-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.730438 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-registry-certificates\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.731434 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0200560-1870-4841-b74a-9eec2f50efef-trusted-ca\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.734975 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-registry-tls\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.735043 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0200560-1870-4841-b74a-9eec2f50efef-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.747949 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzqt\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-kube-api-access-6lzqt\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: \"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.748474 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0200560-1870-4841-b74a-9eec2f50efef-bound-sa-token\") pod \"image-registry-66df7c8f76-8h5xj\" (UID: 
\"d0200560-1870-4841-b74a-9eec2f50efef\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:09 crc kubenswrapper[5004]: I1203 14:12:09.822187 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:10 crc kubenswrapper[5004]: I1203 14:12:10.032864 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h5xj"] Dec 03 14:12:10 crc kubenswrapper[5004]: I1203 14:12:10.893400 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" event={"ID":"d0200560-1870-4841-b74a-9eec2f50efef","Type":"ContainerStarted","Data":"b387d12d703c33ff23eca72f8c5797675850ac4112366da91b4916783089df04"} Dec 03 14:12:10 crc kubenswrapper[5004]: I1203 14:12:10.893452 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" event={"ID":"d0200560-1870-4841-b74a-9eec2f50efef","Type":"ContainerStarted","Data":"68f9260c43a969f4ee76c96a26f87347c3a104ce489076d65f273b7272884e93"} Dec 03 14:12:10 crc kubenswrapper[5004]: I1203 14:12:10.893628 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:10 crc kubenswrapper[5004]: I1203 14:12:10.914955 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" podStartSLOduration=1.9149373459999999 podStartE2EDuration="1.914937346s" podCreationTimestamp="2025-12-03 14:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:12:10.913592578 +0000 UTC m=+343.662562814" watchObservedRunningTime="2025-12-03 14:12:10.914937346 +0000 UTC m=+343.663907582" Dec 03 14:12:22 crc kubenswrapper[5004]: I1203 
14:12:22.824895 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:12:22 crc kubenswrapper[5004]: I1203 14:12:22.826079 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:12:29 crc kubenswrapper[5004]: I1203 14:12:29.830616 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8h5xj" Dec 03 14:12:29 crc kubenswrapper[5004]: I1203 14:12:29.944066 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.557664 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.559621 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lkn6v" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="registry-server" containerID="cri-o://294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" gracePeriod=30 Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.571558 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.571817 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f5tn4" 
podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="registry-server" containerID="cri-o://39c0fb8adab88fb69def818b6a964b82e3aa8314037fa7f20b937465db21df33" gracePeriod=30 Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.586332 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.586569 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" containerID="cri-o://7e3124406a1abeb65441c5f84b0fcc7d478e4f016dff6720317a3a4db5c35091" gracePeriod=30 Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.590967 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.591143 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qrsqv" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="registry-server" containerID="cri-o://93a9c33f3487a8e4fb70ec9c101c323f058deb598d7dfa0b6ea1339df89e75b1" gracePeriod=30 Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.600177 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x2qkd"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.600970 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.601982 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.602185 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9m4fr" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="registry-server" containerID="cri-o://6766784ce9d042e70b4e6ffd2cfd652010879144341f3a67b2fc0265d2e9e3ad" gracePeriod=30 Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.623793 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x2qkd"] Dec 03 14:12:31 crc kubenswrapper[5004]: E1203 14:12:31.717486 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 is running failed: container process not found" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 14:12:31 crc kubenswrapper[5004]: E1203 14:12:31.717915 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 is running failed: container process not found" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 14:12:31 crc kubenswrapper[5004]: E1203 14:12:31.718248 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 is 
running failed: container process not found" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 14:12:31 crc kubenswrapper[5004]: E1203 14:12:31.718272 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-lkn6v" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="registry-server" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.747235 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjqs\" (UniqueName: \"kubernetes.io/projected/cca7b643-a679-4b89-b42d-a18c552a737b-kube-api-access-jqjqs\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.747303 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.747360 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 
14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.849003 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjqs\" (UniqueName: \"kubernetes.io/projected/cca7b643-a679-4b89-b42d-a18c552a737b-kube-api-access-jqjqs\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.849420 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.849458 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.851089 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.865971 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cca7b643-a679-4b89-b42d-a18c552a737b-marketplace-operator-metrics\") 
pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.870974 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjqs\" (UniqueName: \"kubernetes.io/projected/cca7b643-a679-4b89-b42d-a18c552a737b-kube-api-access-jqjqs\") pod \"marketplace-operator-79b997595-x2qkd\" (UID: \"cca7b643-a679-4b89-b42d-a18c552a737b\") " pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.925218 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:31 crc kubenswrapper[5004]: I1203 14:12:31.997652 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.028519 5004 generic.go:334] "Generic (PLEG): container finished" podID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerID="93a9c33f3487a8e4fb70ec9c101c323f058deb598d7dfa0b6ea1339df89e75b1" exitCode=0 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.028573 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerDied","Data":"93a9c33f3487a8e4fb70ec9c101c323f058deb598d7dfa0b6ea1339df89e75b1"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.031051 5004 generic.go:334] "Generic (PLEG): container finished" podID="f97b6736-c178-4178-b21b-abeb67027c36" containerID="39c0fb8adab88fb69def818b6a964b82e3aa8314037fa7f20b937465db21df33" exitCode=0 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.031099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" 
event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerDied","Data":"39c0fb8adab88fb69def818b6a964b82e3aa8314037fa7f20b937465db21df33"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.032366 5004 generic.go:334] "Generic (PLEG): container finished" podID="98002580-e0a7-49b9-9258-222fd6901e29" containerID="7e3124406a1abeb65441c5f84b0fcc7d478e4f016dff6720317a3a4db5c35091" exitCode=0 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.032437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerDied","Data":"7e3124406a1abeb65441c5f84b0fcc7d478e4f016dff6720317a3a4db5c35091"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.032462 5004 scope.go:117] "RemoveContainer" containerID="a17a3fbeefeb9762061857ac31ff8b7e7fa4e5a5fcbd3f222153ade5f780df51" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.039301 5004 generic.go:334] "Generic (PLEG): container finished" podID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerID="6766784ce9d042e70b4e6ffd2cfd652010879144341f3a67b2fc0265d2e9e3ad" exitCode=0 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.039400 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerDied","Data":"6766784ce9d042e70b4e6ffd2cfd652010879144341f3a67b2fc0265d2e9e3ad"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.052395 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content\") pod \"098255d0-cc88-4fba-bbff-b4427d1dac07\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.052512 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities\") pod \"098255d0-cc88-4fba-bbff-b4427d1dac07\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.052620 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhw4r\" (UniqueName: \"kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r\") pod \"098255d0-cc88-4fba-bbff-b4427d1dac07\" (UID: \"098255d0-cc88-4fba-bbff-b4427d1dac07\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.056659 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities" (OuterVolumeSpecName: "utilities") pod "098255d0-cc88-4fba-bbff-b4427d1dac07" (UID: "098255d0-cc88-4fba-bbff-b4427d1dac07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r" (OuterVolumeSpecName: "kube-api-access-rhw4r") pod "098255d0-cc88-4fba-bbff-b4427d1dac07" (UID: "098255d0-cc88-4fba-bbff-b4427d1dac07"). InnerVolumeSpecName "kube-api-access-rhw4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058484 5004 generic.go:334] "Generic (PLEG): container finished" podID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" exitCode=0 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerDied","Data":"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058566 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lkn6v" event={"ID":"098255d0-cc88-4fba-bbff-b4427d1dac07","Type":"ContainerDied","Data":"4f76424cdddd81cd0c9e7abe8fcc95bdc3bff03f26bc36e16add8d98509f42ce"} Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058588 5004 scope.go:117] "RemoveContainer" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.058727 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lkn6v" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.086466 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.112416 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "098255d0-cc88-4fba-bbff-b4427d1dac07" (UID: "098255d0-cc88-4fba-bbff-b4427d1dac07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.127010 5004 scope.go:117] "RemoveContainer" containerID="9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153426 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jgrf\" (UniqueName: \"kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf\") pod \"e9517257-d9b4-480b-bc6f-6424577ef33b\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153560 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities\") pod \"e9517257-d9b4-480b-bc6f-6424577ef33b\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153626 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content\") pod \"e9517257-d9b4-480b-bc6f-6424577ef33b\" (UID: \"e9517257-d9b4-480b-bc6f-6424577ef33b\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153906 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhw4r\" (UniqueName: \"kubernetes.io/projected/098255d0-cc88-4fba-bbff-b4427d1dac07-kube-api-access-rhw4r\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153932 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.153945 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/098255d0-cc88-4fba-bbff-b4427d1dac07-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.154686 5004 scope.go:117] "RemoveContainer" containerID="a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.154783 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities" (OuterVolumeSpecName: "utilities") pod "e9517257-d9b4-480b-bc6f-6424577ef33b" (UID: "e9517257-d9b4-480b-bc6f-6424577ef33b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.158116 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf" (OuterVolumeSpecName: "kube-api-access-6jgrf") pod "e9517257-d9b4-480b-bc6f-6424577ef33b" (UID: "e9517257-d9b4-480b-bc6f-6424577ef33b"). InnerVolumeSpecName "kube-api-access-6jgrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.173358 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9517257-d9b4-480b-bc6f-6424577ef33b" (UID: "e9517257-d9b4-480b-bc6f-6424577ef33b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.176618 5004 scope.go:117] "RemoveContainer" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" Dec 03 14:12:32 crc kubenswrapper[5004]: E1203 14:12:32.177569 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173\": container with ID starting with 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 not found: ID does not exist" containerID="294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.177614 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173"} err="failed to get container status \"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173\": rpc error: code = NotFound desc = could not find container \"294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173\": container with ID starting with 294344fa66471c59749f94fa347cc1e4a66176da92118fa108b69f096bfbe173 not found: ID does not exist" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.177643 5004 scope.go:117] "RemoveContainer" containerID="9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8" Dec 03 14:12:32 crc kubenswrapper[5004]: E1203 14:12:32.178001 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8\": container with ID starting with 9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8 not found: ID does not exist" containerID="9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.178031 
5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8"} err="failed to get container status \"9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8\": rpc error: code = NotFound desc = could not find container \"9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8\": container with ID starting with 9a8915c9018dfb1aadfa0eee65e2e2abaed31be7b76498696a9c2a2f097beec8 not found: ID does not exist" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.178046 5004 scope.go:117] "RemoveContainer" containerID="a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45" Dec 03 14:12:32 crc kubenswrapper[5004]: E1203 14:12:32.178292 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45\": container with ID starting with a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45 not found: ID does not exist" containerID="a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.178316 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45"} err="failed to get container status \"a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45\": rpc error: code = NotFound desc = could not find container \"a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45\": container with ID starting with a770e8dbf408ff7947f098c55e150e732436c7f73344d11cbcfc36a217dd1b45 not found: ID does not exist" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.207079 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x2qkd"] Dec 03 14:12:32 crc kubenswrapper[5004]: W1203 
14:12:32.213100 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca7b643_a679_4b89_b42d_a18c552a737b.slice/crio-9b5eb20d4e841ea3b108b04b8026cc7067f95bdbea20e625968c9c449e26a6f9 WatchSource:0}: Error finding container 9b5eb20d4e841ea3b108b04b8026cc7067f95bdbea20e625968c9c449e26a6f9: Status 404 returned error can't find the container with id 9b5eb20d4e841ea3b108b04b8026cc7067f95bdbea20e625968c9c449e26a6f9 Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.255020 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.255053 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9517257-d9b4-480b-bc6f-6424577ef33b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.255065 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jgrf\" (UniqueName: \"kubernetes.io/projected/e9517257-d9b4-480b-bc6f-6424577ef33b-kube-api-access-6jgrf\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.399596 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.399648 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lkn6v"] Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.429955 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.454204 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.483554 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.558151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq5k2\" (UniqueName: \"kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2\") pod \"98002580-e0a7-49b9-9258-222fd6901e29\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.558271 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content\") pod \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.558311 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpvb\" (UniqueName: \"kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb\") pod \"f97b6736-c178-4178-b21b-abeb67027c36\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.558971 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca\") pod \"98002580-e0a7-49b9-9258-222fd6901e29\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559093 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities\") pod 
\"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics\") pod \"98002580-e0a7-49b9-9258-222fd6901e29\" (UID: \"98002580-e0a7-49b9-9258-222fd6901e29\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559198 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzzjs\" (UniqueName: \"kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs\") pod \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\" (UID: \"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559252 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content\") pod \"f97b6736-c178-4178-b21b-abeb67027c36\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559288 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities\") pod \"f97b6736-c178-4178-b21b-abeb67027c36\" (UID: \"f97b6736-c178-4178-b21b-abeb67027c36\") " Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559544 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "98002580-e0a7-49b9-9258-222fd6901e29" (UID: "98002580-e0a7-49b9-9258-222fd6901e29"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.559950 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.562680 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities" (OuterVolumeSpecName: "utilities") pod "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" (UID: "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.562967 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities" (OuterVolumeSpecName: "utilities") pod "f97b6736-c178-4178-b21b-abeb67027c36" (UID: "f97b6736-c178-4178-b21b-abeb67027c36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.563001 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2" (OuterVolumeSpecName: "kube-api-access-fq5k2") pod "98002580-e0a7-49b9-9258-222fd6901e29" (UID: "98002580-e0a7-49b9-9258-222fd6901e29"). InnerVolumeSpecName "kube-api-access-fq5k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.563084 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb" (OuterVolumeSpecName: "kube-api-access-ljpvb") pod "f97b6736-c178-4178-b21b-abeb67027c36" (UID: "f97b6736-c178-4178-b21b-abeb67027c36"). InnerVolumeSpecName "kube-api-access-ljpvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.564997 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs" (OuterVolumeSpecName: "kube-api-access-zzzjs") pod "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" (UID: "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd"). InnerVolumeSpecName "kube-api-access-zzzjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.565062 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "98002580-e0a7-49b9-9258-222fd6901e29" (UID: "98002580-e0a7-49b9-9258-222fd6901e29"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.613549 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f97b6736-c178-4178-b21b-abeb67027c36" (UID: "f97b6736-c178-4178-b21b-abeb67027c36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662559 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662641 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f97b6736-c178-4178-b21b-abeb67027c36-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662663 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq5k2\" (UniqueName: \"kubernetes.io/projected/98002580-e0a7-49b9-9258-222fd6901e29-kube-api-access-fq5k2\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662679 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpvb\" (UniqueName: \"kubernetes.io/projected/f97b6736-c178-4178-b21b-abeb67027c36-kube-api-access-ljpvb\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662690 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662699 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98002580-e0a7-49b9-9258-222fd6901e29-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.662709 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzzjs\" (UniqueName: \"kubernetes.io/projected/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-kube-api-access-zzzjs\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:32 
crc kubenswrapper[5004]: I1203 14:12:32.672300 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" (UID: "ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:32 crc kubenswrapper[5004]: I1203 14:12:32.764346 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.066545 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" event={"ID":"cca7b643-a679-4b89-b42d-a18c552a737b","Type":"ContainerStarted","Data":"79e521857a97db5e9f528da8a0ab015bc46b6ceb8c79a1ab24699e756c9ba8e4"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.066978 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" event={"ID":"cca7b643-a679-4b89-b42d-a18c552a737b","Type":"ContainerStarted","Data":"9b5eb20d4e841ea3b108b04b8026cc7067f95bdbea20e625968c9c449e26a6f9"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.067015 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.072044 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.074418 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qrsqv" 
event={"ID":"e9517257-d9b4-480b-bc6f-6424577ef33b","Type":"ContainerDied","Data":"a7d793d0ec92fc434715ad94fed0e4c7bb285031fbbf3ba31bf93767d4290f08"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.074496 5004 scope.go:117] "RemoveContainer" containerID="93a9c33f3487a8e4fb70ec9c101c323f058deb598d7dfa0b6ea1339df89e75b1" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.074638 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qrsqv" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.078456 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5tn4" event={"ID":"f97b6736-c178-4178-b21b-abeb67027c36","Type":"ContainerDied","Data":"af1448fe11c00ba1c13ee72a28656eddd5718c1dc1d78e55f3c37c95057bc6cd"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.078480 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f5tn4" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.090721 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.090734 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9mslz" event={"ID":"98002580-e0a7-49b9-9258-222fd6901e29","Type":"ContainerDied","Data":"0379b10d5f11b7b16fecd08cb2440cb2aaea66382f879e784009d83ae9e4428e"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.094612 5004 scope.go:117] "RemoveContainer" containerID="8a2cd742b3cb051fcf14ccda3b704c9f82e695d5d27c62cdaa12cbff5e341508" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.096662 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9m4fr" event={"ID":"ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd","Type":"ContainerDied","Data":"0119244b574cd1fdbaa37ee83b4ef46d9afac400c4f5528522516b929ca18b4e"} Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.096793 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9m4fr" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.100269 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-x2qkd" podStartSLOduration=2.100250112 podStartE2EDuration="2.100250112s" podCreationTimestamp="2025-12-03 14:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:12:33.099523871 +0000 UTC m=+365.848494107" watchObservedRunningTime="2025-12-03 14:12:33.100250112 +0000 UTC m=+365.849220348" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.133241 5004 scope.go:117] "RemoveContainer" containerID="1ec4d41556278b6ad5820b2ea30232ada8f95c248108f0de0f5d14a7cae3b4ed" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.148230 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.152088 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qrsqv"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.183076 5004 scope.go:117] "RemoveContainer" containerID="39c0fb8adab88fb69def818b6a964b82e3aa8314037fa7f20b937465db21df33" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.205759 5004 scope.go:117] "RemoveContainer" containerID="619051eb2811045d647378ee3b51956eccfbf3c2d9794ec25886a2d8e2ce8c62" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.208775 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.214352 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f5tn4"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.219942 5004 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.224349 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9m4fr"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.227125 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.230315 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9mslz"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.232328 5004 scope.go:117] "RemoveContainer" containerID="9cc9bf17035c964f1ce5e7e38cc9890b6d66bf00a82398714324b0f2b2b4e3c6" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.246019 5004 scope.go:117] "RemoveContainer" containerID="7e3124406a1abeb65441c5f84b0fcc7d478e4f016dff6720317a3a4db5c35091" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.260078 5004 scope.go:117] "RemoveContainer" containerID="6766784ce9d042e70b4e6ffd2cfd652010879144341f3a67b2fc0265d2e9e3ad" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.270702 5004 scope.go:117] "RemoveContainer" containerID="58844bbedea68b014ccd5fb026a667cabc2f1e2388ddbc3941e783a46e573d41" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.283459 5004 scope.go:117] "RemoveContainer" containerID="7c13d9fa9a1f05d3c5e42c2b815197bae9691fa30941a1df3377ccb8a6339ec6" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.621190 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" path="/var/lib/kubelet/pods/098255d0-cc88-4fba-bbff-b4427d1dac07/volumes" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.623051 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98002580-e0a7-49b9-9258-222fd6901e29" path="/var/lib/kubelet/pods/98002580-e0a7-49b9-9258-222fd6901e29/volumes" 
Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.623639 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" path="/var/lib/kubelet/pods/e9517257-d9b4-480b-bc6f-6424577ef33b/volumes" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.624793 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" path="/var/lib/kubelet/pods/ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd/volumes" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.625453 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f97b6736-c178-4178-b21b-abeb67027c36" path="/var/lib/kubelet/pods/f97b6736-c178-4178-b21b-abeb67027c36/volumes" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829441 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rkvg7"] Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829645 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829656 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829666 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829672 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829683 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829689 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829698 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829703 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829710 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829716 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829724 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829730 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829740 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829746 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829754 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 
14:12:33.829760 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829768 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829773 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829780 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829786 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829796 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829801 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="extract-utilities" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829808 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829813 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="extract-content" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829822 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 
14:12:33.829827 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: E1203 14:12:33.829836 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829841 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829976 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97b6736-c178-4178-b21b-abeb67027c36" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829990 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="098255d0-cc88-4fba-bbff-b4427d1dac07" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.829999 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.830006 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="98002580-e0a7-49b9-9258-222fd6901e29" containerName="marketplace-operator" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.830014 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9517257-d9b4-480b-bc6f-6424577ef33b" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.830022 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef48557c-8dce-4e3d-8c5f-6ebc1591fdbd" containerName="registry-server" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.830703 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.837383 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.876559 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkvg7"] Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.882123 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-catalog-content\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.882182 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7ht\" (UniqueName: \"kubernetes.io/projected/80fbd159-953d-4ede-956d-d40239fae0f0-kube-api-access-vl7ht\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.882211 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-utilities\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.983408 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-catalog-content\") pod \"certified-operators-rkvg7\" (UID: 
\"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.983671 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7ht\" (UniqueName: \"kubernetes.io/projected/80fbd159-953d-4ede-956d-d40239fae0f0-kube-api-access-vl7ht\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.983777 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-utilities\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.983839 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-catalog-content\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:33 crc kubenswrapper[5004]: I1203 14:12:33.984152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80fbd159-953d-4ede-956d-d40239fae0f0-utilities\") pod \"certified-operators-rkvg7\" (UID: \"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.004291 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7ht\" (UniqueName: \"kubernetes.io/projected/80fbd159-953d-4ede-956d-d40239fae0f0-kube-api-access-vl7ht\") pod \"certified-operators-rkvg7\" (UID: 
\"80fbd159-953d-4ede-956d-d40239fae0f0\") " pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.029546 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2mkgc"] Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.030754 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.032662 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.040466 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mkgc"] Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.085102 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-catalog-content\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.085220 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5g7\" (UniqueName: \"kubernetes.io/projected/00273e2c-88dd-479a-a5e1-7791a7d0cb30-kube-api-access-8m5g7\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.085267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-utilities\") pod \"community-operators-2mkgc\" (UID: 
\"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.186519 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-utilities\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.186973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-catalog-content\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.187169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5g7\" (UniqueName: \"kubernetes.io/projected/00273e2c-88dd-479a-a5e1-7791a7d0cb30-kube-api-access-8m5g7\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.187462 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-catalog-content\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.187459 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00273e2c-88dd-479a-a5e1-7791a7d0cb30-utilities\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") 
" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.187943 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.206790 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5g7\" (UniqueName: \"kubernetes.io/projected/00273e2c-88dd-479a-a5e1-7791a7d0cb30-kube-api-access-8m5g7\") pod \"community-operators-2mkgc\" (UID: \"00273e2c-88dd-479a-a5e1-7791a7d0cb30\") " pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.351167 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.362964 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkvg7"] Dec 03 14:12:34 crc kubenswrapper[5004]: I1203 14:12:34.575202 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mkgc"] Dec 03 14:12:34 crc kubenswrapper[5004]: W1203 14:12:34.579822 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00273e2c_88dd_479a_a5e1_7791a7d0cb30.slice/crio-1bfc4a888e8bf6a99f01d789af8d08c91901e3b646fa0a1dddc9e2002415583f WatchSource:0}: Error finding container 1bfc4a888e8bf6a99f01d789af8d08c91901e3b646fa0a1dddc9e2002415583f: Status 404 returned error can't find the container with id 1bfc4a888e8bf6a99f01d789af8d08c91901e3b646fa0a1dddc9e2002415583f Dec 03 14:12:35 crc kubenswrapper[5004]: I1203 14:12:35.113177 5004 generic.go:334] "Generic (PLEG): container finished" podID="00273e2c-88dd-479a-a5e1-7791a7d0cb30" containerID="6b0cdbcbdba3a7a229d353b9b4e74e64abcb031cec9d6b3e50254619b339eacd" exitCode=0 Dec 03 14:12:35 
crc kubenswrapper[5004]: I1203 14:12:35.113253 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mkgc" event={"ID":"00273e2c-88dd-479a-a5e1-7791a7d0cb30","Type":"ContainerDied","Data":"6b0cdbcbdba3a7a229d353b9b4e74e64abcb031cec9d6b3e50254619b339eacd"} Dec 03 14:12:35 crc kubenswrapper[5004]: I1203 14:12:35.113285 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mkgc" event={"ID":"00273e2c-88dd-479a-a5e1-7791a7d0cb30","Type":"ContainerStarted","Data":"1bfc4a888e8bf6a99f01d789af8d08c91901e3b646fa0a1dddc9e2002415583f"} Dec 03 14:12:35 crc kubenswrapper[5004]: I1203 14:12:35.114582 5004 generic.go:334] "Generic (PLEG): container finished" podID="80fbd159-953d-4ede-956d-d40239fae0f0" containerID="93a95dc4ee96c60aadc02d1afbbc7b827a7f1d90b6107abf2797674e7e969ab1" exitCode=0 Dec 03 14:12:35 crc kubenswrapper[5004]: I1203 14:12:35.114645 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkvg7" event={"ID":"80fbd159-953d-4ede-956d-d40239fae0f0","Type":"ContainerDied","Data":"93a95dc4ee96c60aadc02d1afbbc7b827a7f1d90b6107abf2797674e7e969ab1"} Dec 03 14:12:35 crc kubenswrapper[5004]: I1203 14:12:35.114667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkvg7" event={"ID":"80fbd159-953d-4ede-956d-d40239fae0f0","Type":"ContainerStarted","Data":"83c4527364b33160d42f468d9b24b7e391825672dca8277a2b529d5d29b3dfd0"} Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.236381 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g99tq"] Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.241138 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.242025 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g99tq"] Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.245654 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.316888 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-utilities\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.317666 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64kr\" (UniqueName: \"kubernetes.io/projected/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-kube-api-access-v64kr\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.317747 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-catalog-content\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.418568 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64kr\" (UniqueName: \"kubernetes.io/projected/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-kube-api-access-v64kr\") pod \"redhat-marketplace-g99tq\" (UID: 
\"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.418616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-catalog-content\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.418671 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-utilities\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.419915 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-catalog-content\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.420053 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-utilities\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.425418 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q9qzr"] Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.426342 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.430397 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.436414 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q9qzr"] Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.442344 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64kr\" (UniqueName: \"kubernetes.io/projected/8a6b3e02-1dfc-4967-809d-9bc9a2176fd4-kube-api-access-v64kr\") pod \"redhat-marketplace-g99tq\" (UID: \"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4\") " pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.520374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztw2\" (UniqueName: \"kubernetes.io/projected/5fd4fd02-ca91-407b-8558-9a0250a7851c-kube-api-access-hztw2\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.520453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-catalog-content\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.520489 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-utilities\") pod \"redhat-operators-q9qzr\" (UID: 
\"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.606662 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.620997 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztw2\" (UniqueName: \"kubernetes.io/projected/5fd4fd02-ca91-407b-8558-9a0250a7851c-kube-api-access-hztw2\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.621064 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-catalog-content\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.621096 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-utilities\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.621510 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-utilities\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.621992 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5fd4fd02-ca91-407b-8558-9a0250a7851c-catalog-content\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.641649 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztw2\" (UniqueName: \"kubernetes.io/projected/5fd4fd02-ca91-407b-8558-9a0250a7851c-kube-api-access-hztw2\") pod \"redhat-operators-q9qzr\" (UID: \"5fd4fd02-ca91-407b-8558-9a0250a7851c\") " pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.763355 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:36 crc kubenswrapper[5004]: I1203 14:12:36.824241 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g99tq"] Dec 03 14:12:36 crc kubenswrapper[5004]: W1203 14:12:36.836044 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6b3e02_1dfc_4967_809d_9bc9a2176fd4.slice/crio-857b3451e50ca1126bc7c208e8b9fdf32a61ca229e1c6efe7364ba734598d093 WatchSource:0}: Error finding container 857b3451e50ca1126bc7c208e8b9fdf32a61ca229e1c6efe7364ba734598d093: Status 404 returned error can't find the container with id 857b3451e50ca1126bc7c208e8b9fdf32a61ca229e1c6efe7364ba734598d093 Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.129324 5004 generic.go:334] "Generic (PLEG): container finished" podID="80fbd159-953d-4ede-956d-d40239fae0f0" containerID="433e1fe04e55d5c6ed50d4172e63f91101329d18af822b4708a261bbd6b615d0" exitCode=0 Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.129362 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkvg7" 
event={"ID":"80fbd159-953d-4ede-956d-d40239fae0f0","Type":"ContainerDied","Data":"433e1fe04e55d5c6ed50d4172e63f91101329d18af822b4708a261bbd6b615d0"} Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.133412 5004 generic.go:334] "Generic (PLEG): container finished" podID="8a6b3e02-1dfc-4967-809d-9bc9a2176fd4" containerID="d75928afd2b9e047efd542b4a7953ff427b7e0ad3591910c6596fd77facff0ae" exitCode=0 Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.134731 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g99tq" event={"ID":"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4","Type":"ContainerDied","Data":"d75928afd2b9e047efd542b4a7953ff427b7e0ad3591910c6596fd77facff0ae"} Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.134766 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g99tq" event={"ID":"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4","Type":"ContainerStarted","Data":"857b3451e50ca1126bc7c208e8b9fdf32a61ca229e1c6efe7364ba734598d093"} Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.144999 5004 generic.go:334] "Generic (PLEG): container finished" podID="00273e2c-88dd-479a-a5e1-7791a7d0cb30" containerID="303339f83df3e6c69970f49118ca0c1b3082e66c4ada7c85493d2d335c3060c2" exitCode=0 Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.145061 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mkgc" event={"ID":"00273e2c-88dd-479a-a5e1-7791a7d0cb30","Type":"ContainerDied","Data":"303339f83df3e6c69970f49118ca0c1b3082e66c4ada7c85493d2d335c3060c2"} Dec 03 14:12:37 crc kubenswrapper[5004]: I1203 14:12:37.172742 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q9qzr"] Dec 03 14:12:37 crc kubenswrapper[5004]: W1203 14:12:37.179391 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd4fd02_ca91_407b_8558_9a0250a7851c.slice/crio-0760a03520a8f9cbb730b3d89496e34e6d28e2f57fdccab68a02525a3a39ea37 WatchSource:0}: Error finding container 0760a03520a8f9cbb730b3d89496e34e6d28e2f57fdccab68a02525a3a39ea37: Status 404 returned error can't find the container with id 0760a03520a8f9cbb730b3d89496e34e6d28e2f57fdccab68a02525a3a39ea37 Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.153199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mkgc" event={"ID":"00273e2c-88dd-479a-a5e1-7791a7d0cb30","Type":"ContainerStarted","Data":"9c4787607788559e8dd3ca17daedc72cc102780e842f327e82ff7ad333856040"} Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.156985 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkvg7" event={"ID":"80fbd159-953d-4ede-956d-d40239fae0f0","Type":"ContainerStarted","Data":"0669b1ccedfbee2adf42f6ef6f5211b5976a3ab32dcaa5c8a97b4dc0aca29e32"} Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.160473 5004 generic.go:334] "Generic (PLEG): container finished" podID="5fd4fd02-ca91-407b-8558-9a0250a7851c" containerID="9f341ec6176d38a6a34d2c45ed49fb968945a2525eb9d6027a8180040a45588b" exitCode=0 Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.160527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9qzr" event={"ID":"5fd4fd02-ca91-407b-8558-9a0250a7851c","Type":"ContainerDied","Data":"9f341ec6176d38a6a34d2c45ed49fb968945a2525eb9d6027a8180040a45588b"} Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.160557 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9qzr" event={"ID":"5fd4fd02-ca91-407b-8558-9a0250a7851c","Type":"ContainerStarted","Data":"0760a03520a8f9cbb730b3d89496e34e6d28e2f57fdccab68a02525a3a39ea37"} Dec 03 14:12:38 crc kubenswrapper[5004]: 
I1203 14:12:38.174743 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2mkgc" podStartSLOduration=1.461421243 podStartE2EDuration="4.17472586s" podCreationTimestamp="2025-12-03 14:12:34 +0000 UTC" firstStartedPulling="2025-12-03 14:12:35.115315982 +0000 UTC m=+367.864286218" lastFinishedPulling="2025-12-03 14:12:37.828620599 +0000 UTC m=+370.577590835" observedRunningTime="2025-12-03 14:12:38.171708002 +0000 UTC m=+370.920678238" watchObservedRunningTime="2025-12-03 14:12:38.17472586 +0000 UTC m=+370.923696106" Dec 03 14:12:38 crc kubenswrapper[5004]: I1203 14:12:38.190669 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rkvg7" podStartSLOduration=2.7371926650000002 podStartE2EDuration="5.190652184s" podCreationTimestamp="2025-12-03 14:12:33 +0000 UTC" firstStartedPulling="2025-12-03 14:12:35.116576039 +0000 UTC m=+367.865546275" lastFinishedPulling="2025-12-03 14:12:37.570035548 +0000 UTC m=+370.319005794" observedRunningTime="2025-12-03 14:12:38.187514922 +0000 UTC m=+370.936485158" watchObservedRunningTime="2025-12-03 14:12:38.190652184 +0000 UTC m=+370.939622410" Dec 03 14:12:39 crc kubenswrapper[5004]: I1203 14:12:39.166788 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9qzr" event={"ID":"5fd4fd02-ca91-407b-8558-9a0250a7851c","Type":"ContainerStarted","Data":"b23b6b6045b3fba107b734273ca1149c89ed1ab89da37f8f5ad9c1f7c1239bf3"} Dec 03 14:12:40 crc kubenswrapper[5004]: I1203 14:12:40.177099 5004 generic.go:334] "Generic (PLEG): container finished" podID="5fd4fd02-ca91-407b-8558-9a0250a7851c" containerID="b23b6b6045b3fba107b734273ca1149c89ed1ab89da37f8f5ad9c1f7c1239bf3" exitCode=0 Dec 03 14:12:40 crc kubenswrapper[5004]: I1203 14:12:40.177272 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9qzr" 
event={"ID":"5fd4fd02-ca91-407b-8558-9a0250a7851c","Type":"ContainerDied","Data":"b23b6b6045b3fba107b734273ca1149c89ed1ab89da37f8f5ad9c1f7c1239bf3"} Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.189066 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.189725 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.250016 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.285623 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rkvg7" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.352450 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.352509 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:44 crc kubenswrapper[5004]: I1203 14:12:44.384831 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:45 crc kubenswrapper[5004]: I1203 14:12:45.252167 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2mkgc" Dec 03 14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.219670 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9qzr" event={"ID":"5fd4fd02-ca91-407b-8558-9a0250a7851c","Type":"ContainerStarted","Data":"55ae4713622d94d5aab18745133cd31b0fdb72a919b394e5cebd031f888656ad"} Dec 03 
14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.226061 5004 generic.go:334] "Generic (PLEG): container finished" podID="8a6b3e02-1dfc-4967-809d-9bc9a2176fd4" containerID="16fc2370dbe436fb936d5db1662100bfcc6de95cb5d4ea535d89f93e755c5ba6" exitCode=0 Dec 03 14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.226133 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g99tq" event={"ID":"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4","Type":"ContainerDied","Data":"16fc2370dbe436fb936d5db1662100bfcc6de95cb5d4ea535d89f93e755c5ba6"} Dec 03 14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.237573 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q9qzr" podStartSLOduration=3.148210543 podStartE2EDuration="10.237551665s" podCreationTimestamp="2025-12-03 14:12:36 +0000 UTC" firstStartedPulling="2025-12-03 14:12:38.162301628 +0000 UTC m=+370.911271864" lastFinishedPulling="2025-12-03 14:12:45.25164275 +0000 UTC m=+378.000612986" observedRunningTime="2025-12-03 14:12:46.236919177 +0000 UTC m=+378.985889423" watchObservedRunningTime="2025-12-03 14:12:46.237551665 +0000 UTC m=+378.986521921" Dec 03 14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.764673 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:46 crc kubenswrapper[5004]: I1203 14:12:46.764721 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:47 crc kubenswrapper[5004]: I1203 14:12:47.235020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g99tq" event={"ID":"8a6b3e02-1dfc-4967-809d-9bc9a2176fd4","Type":"ContainerStarted","Data":"7f7230b93c8cc11fdeb0db862a59ab3a97cccf92e4d789b91679d120aca4ba70"} Dec 03 14:12:47 crc kubenswrapper[5004]: I1203 14:12:47.255325 5004 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g99tq" podStartSLOduration=1.510551405 podStartE2EDuration="11.255307298s" podCreationTimestamp="2025-12-03 14:12:36 +0000 UTC" firstStartedPulling="2025-12-03 14:12:37.135402069 +0000 UTC m=+369.884372305" lastFinishedPulling="2025-12-03 14:12:46.880157962 +0000 UTC m=+379.629128198" observedRunningTime="2025-12-03 14:12:47.251441336 +0000 UTC m=+380.000411582" watchObservedRunningTime="2025-12-03 14:12:47.255307298 +0000 UTC m=+380.004277534" Dec 03 14:12:47 crc kubenswrapper[5004]: I1203 14:12:47.807889 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q9qzr" podUID="5fd4fd02-ca91-407b-8558-9a0250a7851c" containerName="registry-server" probeResult="failure" output=< Dec 03 14:12:47 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 03 14:12:47 crc kubenswrapper[5004]: > Dec 03 14:12:52 crc kubenswrapper[5004]: I1203 14:12:52.823978 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:12:52 crc kubenswrapper[5004]: I1203 14:12:52.824512 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:12:54 crc kubenswrapper[5004]: I1203 14:12:54.984701 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" podUID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" containerName="registry" 
containerID="cri-o://0d81f8e252698daa20ad35352f41adfb2b5b1bf6cdf2664b6c718b0d0cabb97d" gracePeriod=30 Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.276009 5004 generic.go:334] "Generic (PLEG): container finished" podID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" containerID="0d81f8e252698daa20ad35352f41adfb2b5b1bf6cdf2664b6c718b0d0cabb97d" exitCode=0 Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.276121 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" event={"ID":"af08e33d-fe7e-48e5-a7ae-149d75ef5595","Type":"ContainerDied","Data":"0d81f8e252698daa20ad35352f41adfb2b5b1bf6cdf2664b6c718b0d0cabb97d"} Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.276344 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" event={"ID":"af08e33d-fe7e-48e5-a7ae-149d75ef5595","Type":"ContainerDied","Data":"7f9db382dda0730f92a73eb1d856fe0adc6f404f16854f3b84699648143bce6c"} Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.276366 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9db382dda0730f92a73eb1d856fe0adc6f404f16854f3b84699648143bce6c" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.288915 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472261 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472309 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472505 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472582 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472624 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtw5\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472647 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472669 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.472708 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets\") pod \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\" (UID: \"af08e33d-fe7e-48e5-a7ae-149d75ef5595\") " Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.473371 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.473920 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.478432 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.478941 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5" (OuterVolumeSpecName: "kube-api-access-nhtw5") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "kube-api-access-nhtw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.480350 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.480502 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.480809 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.487728 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "af08e33d-fe7e-48e5-a7ae-149d75ef5595" (UID: "af08e33d-fe7e-48e5-a7ae-149d75ef5595"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574311 5004 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574379 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhtw5\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-kube-api-access-nhtw5\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574393 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af08e33d-fe7e-48e5-a7ae-149d75ef5595-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574402 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574415 5004 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af08e33d-fe7e-48e5-a7ae-149d75ef5595-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574425 5004 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af08e33d-fe7e-48e5-a7ae-149d75ef5595-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:55 crc kubenswrapper[5004]: I1203 14:12:55.574435 5004 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af08e33d-fe7e-48e5-a7ae-149d75ef5595-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.280740 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bz6x2" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.304766 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"] Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.307881 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bz6x2"] Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.607914 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.608311 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.657127 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.821282 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:56 crc kubenswrapper[5004]: I1203 14:12:56.859285 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q9qzr" Dec 03 14:12:57 crc kubenswrapper[5004]: I1203 14:12:57.350072 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g99tq" Dec 03 14:12:57 crc kubenswrapper[5004]: I1203 14:12:57.619807 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" path="/var/lib/kubelet/pods/af08e33d-fe7e-48e5-a7ae-149d75ef5595/volumes" Dec 03 14:13:22 crc kubenswrapper[5004]: I1203 14:13:22.824455 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:13:22 crc kubenswrapper[5004]: I1203 14:13:22.825209 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:13:22 crc kubenswrapper[5004]: I1203 14:13:22.825313 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:13:22 crc kubenswrapper[5004]: I1203 14:13:22.826188 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:13:22 crc kubenswrapper[5004]: I1203 14:13:22.826278 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46" gracePeriod=600 Dec 03 14:13:23 crc kubenswrapper[5004]: I1203 14:13:23.434049 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46" exitCode=0 Dec 03 14:13:23 crc kubenswrapper[5004]: I1203 14:13:23.434139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46"} Dec 03 14:13:23 crc kubenswrapper[5004]: I1203 14:13:23.434485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85"} Dec 03 14:13:23 crc kubenswrapper[5004]: I1203 14:13:23.434512 5004 scope.go:117] "RemoveContainer" containerID="7be40361cf745efff50d3f2a13fe2391093dcf4047d78fcd854728dbc91d5667" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.161694 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4"] Dec 03 14:15:00 crc kubenswrapper[5004]: E1203 14:15:00.162462 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" containerName="registry" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.162475 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" containerName="registry" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.162586 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="af08e33d-fe7e-48e5-a7ae-149d75ef5595" containerName="registry" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.162999 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.164792 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.164819 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.172104 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4"] Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.273740 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcf6\" (UniqueName: \"kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.273807 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.273829 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.375422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcf6\" (UniqueName: \"kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.375487 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.375516 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.376317 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.385763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.393007 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcf6\" (UniqueName: \"kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6\") pod \"collect-profiles-29412855-gc7c4\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.482262 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.678146 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4"] Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.994081 5004 generic.go:334] "Generic (PLEG): container finished" podID="aadc310e-caab-475e-9900-c376fd4f5371" containerID="a207fe752e9aa222fa7939c9c7e70388018ea970ee762f607fc5dd498b0ddbb8" exitCode=0 Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.994272 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" event={"ID":"aadc310e-caab-475e-9900-c376fd4f5371","Type":"ContainerDied","Data":"a207fe752e9aa222fa7939c9c7e70388018ea970ee762f607fc5dd498b0ddbb8"} Dec 03 14:15:00 crc kubenswrapper[5004]: I1203 14:15:00.994401 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" 
event={"ID":"aadc310e-caab-475e-9900-c376fd4f5371","Type":"ContainerStarted","Data":"6ec9fbe662235f423d28c7095d72bc8f1de2b1c7bd617add483381cb5bc53e80"} Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.213266 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.298519 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume\") pod \"aadc310e-caab-475e-9900-c376fd4f5371\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.298661 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume\") pod \"aadc310e-caab-475e-9900-c376fd4f5371\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.298725 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwcf6\" (UniqueName: \"kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6\") pod \"aadc310e-caab-475e-9900-c376fd4f5371\" (UID: \"aadc310e-caab-475e-9900-c376fd4f5371\") " Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.299367 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume" (OuterVolumeSpecName: "config-volume") pod "aadc310e-caab-475e-9900-c376fd4f5371" (UID: "aadc310e-caab-475e-9900-c376fd4f5371"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.304171 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aadc310e-caab-475e-9900-c376fd4f5371" (UID: "aadc310e-caab-475e-9900-c376fd4f5371"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.304350 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6" (OuterVolumeSpecName: "kube-api-access-dwcf6") pod "aadc310e-caab-475e-9900-c376fd4f5371" (UID: "aadc310e-caab-475e-9900-c376fd4f5371"). InnerVolumeSpecName "kube-api-access-dwcf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.400742 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadc310e-caab-475e-9900-c376fd4f5371-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.400828 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadc310e-caab-475e-9900-c376fd4f5371-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:02 crc kubenswrapper[5004]: I1203 14:15:02.400841 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwcf6\" (UniqueName: \"kubernetes.io/projected/aadc310e-caab-475e-9900-c376fd4f5371-kube-api-access-dwcf6\") on node \"crc\" DevicePath \"\"" Dec 03 14:15:03 crc kubenswrapper[5004]: I1203 14:15:03.011371 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" 
event={"ID":"aadc310e-caab-475e-9900-c376fd4f5371","Type":"ContainerDied","Data":"6ec9fbe662235f423d28c7095d72bc8f1de2b1c7bd617add483381cb5bc53e80"} Dec 03 14:15:03 crc kubenswrapper[5004]: I1203 14:15:03.011609 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec9fbe662235f423d28c7095d72bc8f1de2b1c7bd617add483381cb5bc53e80" Dec 03 14:15:03 crc kubenswrapper[5004]: I1203 14:15:03.011731 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4" Dec 03 14:15:29 crc kubenswrapper[5004]: I1203 14:15:29.533079 5004 scope.go:117] "RemoveContainer" containerID="1f85a3c0e97d35e002fc86253917a42486b97be6c49fd0ce81d206abe54154b2" Dec 03 14:15:29 crc kubenswrapper[5004]: I1203 14:15:29.550448 5004 scope.go:117] "RemoveContainer" containerID="0d81f8e252698daa20ad35352f41adfb2b5b1bf6cdf2664b6c718b0d0cabb97d" Dec 03 14:15:52 crc kubenswrapper[5004]: I1203 14:15:52.824276 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:15:52 crc kubenswrapper[5004]: I1203 14:15:52.825009 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:16:22 crc kubenswrapper[5004]: I1203 14:16:22.825137 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:16:22 crc kubenswrapper[5004]: I1203 14:16:22.825689 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:16:29 crc kubenswrapper[5004]: I1203 14:16:29.600299 5004 scope.go:117] "RemoveContainer" containerID="b68d45d6514cdf50f2686e09e7fb35806868a00aeb4108c49027943d455e8511" Dec 03 14:16:29 crc kubenswrapper[5004]: I1203 14:16:29.619803 5004 scope.go:117] "RemoveContainer" containerID="42149d0bd11e918117b8b8af77af197a641ecb5e7c90386e57e69fe0f294047c" Dec 03 14:16:52 crc kubenswrapper[5004]: I1203 14:16:52.824154 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:16:52 crc kubenswrapper[5004]: I1203 14:16:52.824777 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:16:52 crc kubenswrapper[5004]: I1203 14:16:52.824829 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:16:52 crc kubenswrapper[5004]: I1203 14:16:52.825422 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:16:52 crc kubenswrapper[5004]: I1203 14:16:52.825492 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85" gracePeriod=600 Dec 03 14:16:53 crc kubenswrapper[5004]: I1203 14:16:53.667708 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85" exitCode=0 Dec 03 14:16:53 crc kubenswrapper[5004]: I1203 14:16:53.667758 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85"} Dec 03 14:16:53 crc kubenswrapper[5004]: I1203 14:16:53.668276 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21"} Dec 03 14:16:53 crc kubenswrapper[5004]: I1203 14:16:53.668309 5004 scope.go:117] "RemoveContainer" containerID="4b92db5e43cff8d0c223ece939ebee953122837d94b1c020554766cde011ab46" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.681291 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hjmk5"] Dec 03 14:18:49 crc kubenswrapper[5004]: E1203 
14:18:49.682108 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadc310e-caab-475e-9900-c376fd4f5371" containerName="collect-profiles" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.682125 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadc310e-caab-475e-9900-c376fd4f5371" containerName="collect-profiles" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.682256 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadc310e-caab-475e-9900-c376fd4f5371" containerName="collect-profiles" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.682768 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.685609 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4rv6q"] Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.686392 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4rv6q" Dec 03 14:18:49 crc kubenswrapper[5004]: W1203 14:18:49.695662 5004 reflector.go:561] object-"cert-manager"/"cert-manager-dockercfg-tx5kc": failed to list *v1.Secret: secrets "cert-manager-dockercfg-tx5kc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Dec 03 14:18:49 crc kubenswrapper[5004]: E1203 14:18:49.695900 5004 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"cert-manager-dockercfg-tx5kc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-manager-dockercfg-tx5kc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.695760 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.695831 5004 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zc6fv" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.697539 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.706073 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hjmk5"] Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.729894 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4rv6q"] Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.752084 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-swvh8"] Dec 03 14:18:49 crc 
kubenswrapper[5004]: I1203 14:18:49.752796 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.758657 5004 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-fl5z4" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.771429 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-swvh8"] Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.847738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnqq\" (UniqueName: \"kubernetes.io/projected/6b108801-1198-420f-ab57-dea765daf047-kube-api-access-kvnqq\") pod \"cert-manager-cainjector-7f985d654d-hjmk5\" (UID: \"6b108801-1198-420f-ab57-dea765daf047\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.847828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhpw\" (UniqueName: \"kubernetes.io/projected/a64714f4-8d4b-4101-bf1c-d953cddb3f08-kube-api-access-crhpw\") pod \"cert-manager-webhook-5655c58dd6-swvh8\" (UID: \"a64714f4-8d4b-4101-bf1c-d953cddb3f08\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.847883 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kpcz\" (UniqueName: \"kubernetes.io/projected/0399bdc2-ceca-49e1-a00b-a8685a860ebe-kube-api-access-2kpcz\") pod \"cert-manager-5b446d88c5-4rv6q\" (UID: \"0399bdc2-ceca-49e1-a00b-a8685a860ebe\") " pod="cert-manager/cert-manager-5b446d88c5-4rv6q" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.948659 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-crhpw\" (UniqueName: \"kubernetes.io/projected/a64714f4-8d4b-4101-bf1c-d953cddb3f08-kube-api-access-crhpw\") pod \"cert-manager-webhook-5655c58dd6-swvh8\" (UID: \"a64714f4-8d4b-4101-bf1c-d953cddb3f08\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.948727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kpcz\" (UniqueName: \"kubernetes.io/projected/0399bdc2-ceca-49e1-a00b-a8685a860ebe-kube-api-access-2kpcz\") pod \"cert-manager-5b446d88c5-4rv6q\" (UID: \"0399bdc2-ceca-49e1-a00b-a8685a860ebe\") " pod="cert-manager/cert-manager-5b446d88c5-4rv6q" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.948796 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnqq\" (UniqueName: \"kubernetes.io/projected/6b108801-1198-420f-ab57-dea765daf047-kube-api-access-kvnqq\") pod \"cert-manager-cainjector-7f985d654d-hjmk5\" (UID: \"6b108801-1198-420f-ab57-dea765daf047\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.967202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kpcz\" (UniqueName: \"kubernetes.io/projected/0399bdc2-ceca-49e1-a00b-a8685a860ebe-kube-api-access-2kpcz\") pod \"cert-manager-5b446d88c5-4rv6q\" (UID: \"0399bdc2-ceca-49e1-a00b-a8685a860ebe\") " pod="cert-manager/cert-manager-5b446d88c5-4rv6q" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.968928 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhpw\" (UniqueName: \"kubernetes.io/projected/a64714f4-8d4b-4101-bf1c-d953cddb3f08-kube-api-access-crhpw\") pod \"cert-manager-webhook-5655c58dd6-swvh8\" (UID: \"a64714f4-8d4b-4101-bf1c-d953cddb3f08\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:49 crc kubenswrapper[5004]: I1203 14:18:49.969661 
5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnqq\" (UniqueName: \"kubernetes.io/projected/6b108801-1198-420f-ab57-dea765daf047-kube-api-access-kvnqq\") pod \"cert-manager-cainjector-7f985d654d-hjmk5\" (UID: \"6b108801-1198-420f-ab57-dea765daf047\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.001830 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.074764 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.189961 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hjmk5"] Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.202125 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.277613 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-swvh8"] Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.885876 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" event={"ID":"6b108801-1198-420f-ab57-dea765daf047","Type":"ContainerStarted","Data":"9d5cd799ffbd645f983b2e36b2e771f8bad374546e4769ddf39370a893f936f6"} Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.886720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" event={"ID":"a64714f4-8d4b-4101-bf1c-d953cddb3f08","Type":"ContainerStarted","Data":"4dd2517cf1af902f6c1358d17c8703b63c643ab6dce0185e2ddcb0c619b4122d"} Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.899798 5004 
reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tx5kc" Dec 03 14:18:50 crc kubenswrapper[5004]: I1203 14:18:50.904764 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4rv6q" Dec 03 14:18:51 crc kubenswrapper[5004]: I1203 14:18:51.100745 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4rv6q"] Dec 03 14:18:51 crc kubenswrapper[5004]: W1203 14:18:51.113320 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0399bdc2_ceca_49e1_a00b_a8685a860ebe.slice/crio-b68cbd11029295270fe8a18b8b30ea2f0cd377ffc22a9aa397b20ef6093752a1 WatchSource:0}: Error finding container b68cbd11029295270fe8a18b8b30ea2f0cd377ffc22a9aa397b20ef6093752a1: Status 404 returned error can't find the container with id b68cbd11029295270fe8a18b8b30ea2f0cd377ffc22a9aa397b20ef6093752a1 Dec 03 14:18:51 crc kubenswrapper[5004]: I1203 14:18:51.894240 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4rv6q" event={"ID":"0399bdc2-ceca-49e1-a00b-a8685a860ebe","Type":"ContainerStarted","Data":"b68cbd11029295270fe8a18b8b30ea2f0cd377ffc22a9aa397b20ef6093752a1"} Dec 03 14:18:53 crc kubenswrapper[5004]: I1203 14:18:53.910159 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" event={"ID":"6b108801-1198-420f-ab57-dea765daf047","Type":"ContainerStarted","Data":"e502f8db145e9a148eb6203877004a89323c4038be0f615c73e030997eeae298"} Dec 03 14:18:53 crc kubenswrapper[5004]: I1203 14:18:53.929564 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-hjmk5" podStartSLOduration=2.368567373 podStartE2EDuration="4.929544665s" podCreationTimestamp="2025-12-03 14:18:49 +0000 UTC" 
firstStartedPulling="2025-12-03 14:18:50.200568261 +0000 UTC m=+742.949538497" lastFinishedPulling="2025-12-03 14:18:52.761545553 +0000 UTC m=+745.510515789" observedRunningTime="2025-12-03 14:18:53.927697332 +0000 UTC m=+746.676667588" watchObservedRunningTime="2025-12-03 14:18:53.929544665 +0000 UTC m=+746.678514901" Dec 03 14:18:54 crc kubenswrapper[5004]: I1203 14:18:54.916845 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" event={"ID":"a64714f4-8d4b-4101-bf1c-d953cddb3f08","Type":"ContainerStarted","Data":"5ed028d7fbbf4a9c9cc87fb8cd97812072b2ee0fd6aaae6867a31950f5b80265"} Dec 03 14:18:54 crc kubenswrapper[5004]: I1203 14:18:54.917008 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:18:54 crc kubenswrapper[5004]: I1203 14:18:54.918920 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4rv6q" event={"ID":"0399bdc2-ceca-49e1-a00b-a8685a860ebe","Type":"ContainerStarted","Data":"e366dc8777c18a09e60f260e4521f2aedef6ea6b68059eb7e5efb22369942a92"} Dec 03 14:18:54 crc kubenswrapper[5004]: I1203 14:18:54.930466 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" podStartSLOduration=2.012382526 podStartE2EDuration="5.930446562s" podCreationTimestamp="2025-12-03 14:18:49 +0000 UTC" firstStartedPulling="2025-12-03 14:18:50.285086616 +0000 UTC m=+743.034056852" lastFinishedPulling="2025-12-03 14:18:54.203150652 +0000 UTC m=+746.952120888" observedRunningTime="2025-12-03 14:18:54.927746735 +0000 UTC m=+747.676716971" watchObservedRunningTime="2025-12-03 14:18:54.930446562 +0000 UTC m=+747.679416798" Dec 03 14:18:54 crc kubenswrapper[5004]: I1203 14:18:54.949652 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4rv6q" 
podStartSLOduration=2.78599591 podStartE2EDuration="5.94963127s" podCreationTimestamp="2025-12-03 14:18:49 +0000 UTC" firstStartedPulling="2025-12-03 14:18:51.115120652 +0000 UTC m=+743.864090888" lastFinishedPulling="2025-12-03 14:18:54.278756012 +0000 UTC m=+747.027726248" observedRunningTime="2025-12-03 14:18:54.946367267 +0000 UTC m=+747.695337513" watchObservedRunningTime="2025-12-03 14:18:54.94963127 +0000 UTC m=+747.698601506" Dec 03 14:18:56 crc kubenswrapper[5004]: I1203 14:18:56.725466 5004 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.079173 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-swvh8" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.234653 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvbnm"] Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235235 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-controller" containerID="cri-o://033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235375 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235429 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" 
containerName="ovn-acl-logging" containerID="cri-o://0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235424 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-node" containerID="cri-o://2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235599 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="northd" containerID="cri-o://c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235632 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="sbdb" containerID="cri-o://599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.235932 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="nbdb" containerID="cri-o://6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.261061 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" containerID="cri-o://2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" gracePeriod=30 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 
14:19:00.531792 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/3.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.534821 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovn-acl-logging/0.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.535519 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovn-controller/0.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.536266 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.594848 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7fqgh"] Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.595365 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.595474 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.595556 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.595630 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.595745 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" 
containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.595824 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.595933 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="northd" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596009 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="northd" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596086 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-acl-logging" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596157 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-acl-logging" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596252 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596360 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596442 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="sbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596508 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="sbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596589 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-controller" Dec 03 
14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596656 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596737 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-node" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596809 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-node" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.596904 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.596985 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.597060 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="nbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597127 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="nbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.597208 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kubecfg-setup" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597280 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kubecfg-setup" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597485 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 
14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597595 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="northd" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597695 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597772 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovn-acl-logging" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597841 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="sbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.597954 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="nbdb" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598027 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="kube-rbac-proxy-node" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598096 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598167 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598244 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598317 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" 
containerName="kube-rbac-proxy-ovn-metrics" Dec 03 14:19:00 crc kubenswrapper[5004]: E1203 14:19:00.598509 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598586 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.598804 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerName="ovnkube-controller" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.600850 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.695876 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.695983 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696012 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696082 5004 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696096 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696098 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696161 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696216 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696268 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzrc\" (UniqueName: \"kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696322 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696345 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696381 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696401 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696423 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696424 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696444 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696465 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696475 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696499 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696538 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696533 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696559 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696578 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696585 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696628 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696650 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin\") pod \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\" (UID: \"78eea523-e8ee-4f41-93b2-6bbfdcdf3371\") " Dec 03 14:19:00 
crc kubenswrapper[5004]: I1203 14:19:00.696803 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-script-lib\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696833 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-slash\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696918 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-systemd-units\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-netd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696970 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovn-node-metrics-cert\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696993 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-etc-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.696999 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697015 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697030 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log" (OuterVolumeSpecName: "node-log") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697040 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-kubelet\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697050 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket" (OuterVolumeSpecName: "log-socket") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697037 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697066 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697145 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697328 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-node-log\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697092 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697108 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697185 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash" (OuterVolumeSpecName: "host-slash") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697207 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697355 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697470 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-env-overrides\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697454 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697529 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-var-lib-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697563 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-bin\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697646 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-ovn\") pod \"ovnkube-node-7fqgh\" (UID: 
\"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697676 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpp7\" (UniqueName: \"kubernetes.io/projected/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-kube-api-access-bgpp7\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697714 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-netns\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697741 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-systemd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697768 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-log-socket\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697795 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-config\") pod 
\"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697894 5004 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697915 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697930 5004 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697944 5004 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697959 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697973 5004 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.697987 5004 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc 
kubenswrapper[5004]: I1203 14:19:00.698002 5004 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698021 5004 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698035 5004 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698052 5004 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698066 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698080 5004 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698095 5004 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698109 5004 reconciler_common.go:293] "Volume detached for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698125 5004 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.698140 5004 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.709528 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc" (OuterVolumeSpecName: "kube-api-access-gmzrc") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "kube-api-access-gmzrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.709617 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.711111 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "78eea523-e8ee-4f41-93b2-6bbfdcdf3371" (UID: "78eea523-e8ee-4f41-93b2-6bbfdcdf3371"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799344 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-ovn\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799687 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpp7\" (UniqueName: \"kubernetes.io/projected/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-kube-api-access-bgpp7\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799715 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-netns\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799733 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-systemd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799752 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-log-socket\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799784 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-config\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799807 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-script-lib\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799827 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-systemd-units\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799842 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-slash\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc 
kubenswrapper[5004]: I1203 14:19:00.799883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-netd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-ovn\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799905 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovn-node-metrics-cert\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-kubelet\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799972 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-etc-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799992 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800018 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800038 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-node-log\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800083 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-env-overrides\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800104 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-var-lib-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800102 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-systemd-units\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800158 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-bin\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-bin\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800200 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-systemd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800368 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzrc\" (UniqueName: 
\"kubernetes.io/projected/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-kube-api-access-gmzrc\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800386 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800399 5004 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78eea523-e8ee-4f41-93b2-6bbfdcdf3371-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.800915 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-log-socket\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801106 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-slash\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-cni-netd\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.799876 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-netns\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801299 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-run-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801340 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-kubelet\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801372 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-etc-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801403 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801540 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-node-log\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801660 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-host-run-ovn-kubernetes\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801840 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-config\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.801908 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-var-lib-openvswitch\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.802197 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovnkube-script-lib\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.802442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-env-overrides\") pod 
\"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.804422 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-ovn-node-metrics-cert\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.819209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpp7\" (UniqueName: \"kubernetes.io/projected/4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6-kube-api-access-bgpp7\") pod \"ovnkube-node-7fqgh\" (UID: \"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6\") " pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.918208 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.956198 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/2.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.956731 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/1.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.956891 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff08cd56-3e47-4cd7-98ad-8571f178dc62" containerID="f6b3217cb0590f575d85bd7a577d90b72df97e280035f1545948ccb27a9febb5" exitCode=2 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.957010 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerDied","Data":"f6b3217cb0590f575d85bd7a577d90b72df97e280035f1545948ccb27a9febb5"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.957121 5004 scope.go:117] "RemoveContainer" containerID="70b30e744d805278760f80697a661c0fa1e387df3e420a3b40c382c3cf8fe42a" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.958684 5004 scope.go:117] "RemoveContainer" containerID="f6b3217cb0590f575d85bd7a577d90b72df97e280035f1545948ccb27a9febb5" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.962895 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovnkube-controller/3.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.965056 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovn-acl-logging/0.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.965509 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvbnm_78eea523-e8ee-4f41-93b2-6bbfdcdf3371/ovn-controller/0.log" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.965926 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966029 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966085 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966118 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966129 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966144 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966254 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966371 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966441 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966502 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" exitCode=0 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966589 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" exitCode=143 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966641 5004 generic.go:334] "Generic (PLEG): container finished" podID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" exitCode=143 Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966713 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966789 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.966888 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967005 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967079 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967153 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967209 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967257 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967302 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967351 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967419 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967489 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967579 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967630 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967686 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967773 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967823 5004 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967902 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.967996 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968159 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968238 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968335 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968409 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968480 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968534 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968589 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968639 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968691 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968742 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968790 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.968914 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969030 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 
14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969557 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969682 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969772 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969824 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvbnm" event={"ID":"78eea523-e8ee-4f41-93b2-6bbfdcdf3371","Type":"ContainerDied","Data":"c505d925a7e26ca1511514d826505389a5135fea6cf726e7d4e2d4795c21255d"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969899 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.969966 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970017 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970067 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970116 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970163 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970207 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970250 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970293 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.970356 5004 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.994949 5004 scope.go:117] "RemoveContainer" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:00 crc kubenswrapper[5004]: I1203 14:19:00.999799 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvbnm"] Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 
14:19:01.009332 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvbnm"] Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.042205 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.065950 5004 scope.go:117] "RemoveContainer" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.079016 5004 scope.go:117] "RemoveContainer" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.095724 5004 scope.go:117] "RemoveContainer" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.150468 5004 scope.go:117] "RemoveContainer" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.168499 5004 scope.go:117] "RemoveContainer" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.181752 5004 scope.go:117] "RemoveContainer" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.194074 5004 scope.go:117] "RemoveContainer" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.207639 5004 scope.go:117] "RemoveContainer" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.226033 5004 scope.go:117] "RemoveContainer" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.226498 5004 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": container with ID starting with 2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062 not found: ID does not exist" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.226563 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} err="failed to get container status \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": rpc error: code = NotFound desc = could not find container \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": container with ID starting with 2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.226764 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.227334 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": container with ID starting with 93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8 not found: ID does not exist" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227365 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} err="failed to get container status \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": rpc error: code = NotFound desc = could 
not find container \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": container with ID starting with 93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227387 5004 scope.go:117] "RemoveContainer" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.227614 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": container with ID starting with 599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a not found: ID does not exist" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227631 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} err="failed to get container status \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": rpc error: code = NotFound desc = could not find container \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": container with ID starting with 599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227643 5004 scope.go:117] "RemoveContainer" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.227942 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": container with ID starting with 6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa not found: 
ID does not exist" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227958 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} err="failed to get container status \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": rpc error: code = NotFound desc = could not find container \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": container with ID starting with 6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.227970 5004 scope.go:117] "RemoveContainer" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.228245 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": container with ID starting with c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538 not found: ID does not exist" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.228261 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} err="failed to get container status \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": rpc error: code = NotFound desc = could not find container \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": container with ID starting with c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.228274 5004 
scope.go:117] "RemoveContainer" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.229293 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": container with ID starting with 62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885 not found: ID does not exist" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229313 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} err="failed to get container status \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": rpc error: code = NotFound desc = could not find container \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": container with ID starting with 62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229363 5004 scope.go:117] "RemoveContainer" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.229631 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": container with ID starting with 2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e not found: ID does not exist" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229650 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} err="failed to get container status \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": rpc error: code = NotFound desc = could not find container \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": container with ID starting with 2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229663 5004 scope.go:117] "RemoveContainer" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.229894 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": container with ID starting with 0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69 not found: ID does not exist" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229914 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} err="failed to get container status \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": rpc error: code = NotFound desc = could not find container \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": container with ID starting with 0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.229925 5004 scope.go:117] "RemoveContainer" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.230103 5004 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": container with ID starting with 033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810 not found: ID does not exist" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230125 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} err="failed to get container status \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": rpc error: code = NotFound desc = could not find container \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": container with ID starting with 033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230138 5004 scope.go:117] "RemoveContainer" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: E1203 14:19:01.230383 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": container with ID starting with ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17 not found: ID does not exist" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230403 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} err="failed to get container status \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": rpc error: code = NotFound desc = could not find container 
\"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": container with ID starting with ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230415 5004 scope.go:117] "RemoveContainer" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230652 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} err="failed to get container status \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": rpc error: code = NotFound desc = could not find container \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": container with ID starting with 2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230669 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230889 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} err="failed to get container status \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": rpc error: code = NotFound desc = could not find container \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": container with ID starting with 93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.230917 5004 scope.go:117] "RemoveContainer" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.231458 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} err="failed to get container status \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": rpc error: code = NotFound desc = could not find container \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": container with ID starting with 599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.231482 5004 scope.go:117] "RemoveContainer" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232042 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} err="failed to get container status \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": rpc error: code = NotFound desc = could not find container \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": container with ID starting with 6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232110 5004 scope.go:117] "RemoveContainer" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232580 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} err="failed to get container status \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": rpc error: code = NotFound desc = could not find container \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": container with ID starting with 
c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232604 5004 scope.go:117] "RemoveContainer" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232820 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} err="failed to get container status \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": rpc error: code = NotFound desc = could not find container \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": container with ID starting with 62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.232844 5004 scope.go:117] "RemoveContainer" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233189 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} err="failed to get container status \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": rpc error: code = NotFound desc = could not find container \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": container with ID starting with 2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233213 5004 scope.go:117] "RemoveContainer" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233465 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} err="failed to get container status \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": rpc error: code = NotFound desc = could not find container \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": container with ID starting with 0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233488 5004 scope.go:117] "RemoveContainer" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233770 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} err="failed to get container status \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": rpc error: code = NotFound desc = could not find container \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": container with ID starting with 033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.233792 5004 scope.go:117] "RemoveContainer" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234085 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} err="failed to get container status \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": rpc error: code = NotFound desc = could not find container \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": container with ID starting with ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17 not found: ID does not 
exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234111 5004 scope.go:117] "RemoveContainer" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234329 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} err="failed to get container status \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": rpc error: code = NotFound desc = could not find container \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": container with ID starting with 2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234354 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234577 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} err="failed to get container status \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": rpc error: code = NotFound desc = could not find container \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": container with ID starting with 93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234600 5004 scope.go:117] "RemoveContainer" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234892 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} err="failed to get container status 
\"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": rpc error: code = NotFound desc = could not find container \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": container with ID starting with 599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.234916 5004 scope.go:117] "RemoveContainer" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.235132 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} err="failed to get container status \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": rpc error: code = NotFound desc = could not find container \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": container with ID starting with 6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.235158 5004 scope.go:117] "RemoveContainer" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236102 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} err="failed to get container status \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": rpc error: code = NotFound desc = could not find container \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": container with ID starting with c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236129 5004 scope.go:117] "RemoveContainer" 
containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236378 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} err="failed to get container status \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": rpc error: code = NotFound desc = could not find container \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": container with ID starting with 62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236404 5004 scope.go:117] "RemoveContainer" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236760 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} err="failed to get container status \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": rpc error: code = NotFound desc = could not find container \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": container with ID starting with 2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.236785 5004 scope.go:117] "RemoveContainer" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.237091 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} err="failed to get container status \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": rpc error: code = NotFound desc = could 
not find container \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": container with ID starting with 0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.237116 5004 scope.go:117] "RemoveContainer" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.237690 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} err="failed to get container status \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": rpc error: code = NotFound desc = could not find container \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": container with ID starting with 033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.237715 5004 scope.go:117] "RemoveContainer" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238024 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} err="failed to get container status \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": rpc error: code = NotFound desc = could not find container \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": container with ID starting with ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238043 5004 scope.go:117] "RemoveContainer" containerID="2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 
14:19:01.238266 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062"} err="failed to get container status \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": rpc error: code = NotFound desc = could not find container \"2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062\": container with ID starting with 2579168e5d54295d10858bafaac0dd4bca9309d3ccf4625e272e779f08f3d062 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238293 5004 scope.go:117] "RemoveContainer" containerID="93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238543 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8"} err="failed to get container status \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": rpc error: code = NotFound desc = could not find container \"93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8\": container with ID starting with 93a4b791812ab08791a52ade082cdbd69817581bbe8fafaaaf0e3a1bb7b4e8e8 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238568 5004 scope.go:117] "RemoveContainer" containerID="599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238834 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a"} err="failed to get container status \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": rpc error: code = NotFound desc = could not find container \"599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a\": container with ID starting with 
599e5f6e4d8a40b85b8224eff39e86644158e8753937edf2b985721765369d6a not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.238877 5004 scope.go:117] "RemoveContainer" containerID="6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239107 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa"} err="failed to get container status \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": rpc error: code = NotFound desc = could not find container \"6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa\": container with ID starting with 6d0828ec97beebe0c867a8b0638af4d84059f7480c605f35165fa774dc3f8faa not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239135 5004 scope.go:117] "RemoveContainer" containerID="c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239457 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538"} err="failed to get container status \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": rpc error: code = NotFound desc = could not find container \"c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538\": container with ID starting with c1f0302cf78f58146ec1532e5a62725887222a8aad8c977a5752f79a7c795538 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239482 5004 scope.go:117] "RemoveContainer" containerID="62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239682 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885"} err="failed to get container status \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": rpc error: code = NotFound desc = could not find container \"62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885\": container with ID starting with 62210f2dab120663f0dfda51c068360f9f394325565969d2265f079ed222f885 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.239709 5004 scope.go:117] "RemoveContainer" containerID="2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240060 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e"} err="failed to get container status \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": rpc error: code = NotFound desc = could not find container \"2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e\": container with ID starting with 2ac18babb865a54f64ca84e4aeb0ac585e6466d925b1b15e458d800bb7d7659e not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240089 5004 scope.go:117] "RemoveContainer" containerID="0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240308 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69"} err="failed to get container status \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": rpc error: code = NotFound desc = could not find container \"0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69\": container with ID starting with 0debcfb82c82ce4818080096849f07e0d159aa5dd3ffc7a267c6b53d84c50a69 not found: ID does not 
exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240337 5004 scope.go:117] "RemoveContainer" containerID="033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240705 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810"} err="failed to get container status \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": rpc error: code = NotFound desc = could not find container \"033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810\": container with ID starting with 033a6b69c78faa59e1d78988a01738a02358af388ed7e16a9544bd18009df810 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240730 5004 scope.go:117] "RemoveContainer" containerID="ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.240950 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17"} err="failed to get container status \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": rpc error: code = NotFound desc = could not find container \"ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17\": container with ID starting with ca7965cef0241ca17888fd011a9e86385aa2f4fe84832dbcd44005af1f8dda17 not found: ID does not exist" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.627520 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78eea523-e8ee-4f41-93b2-6bbfdcdf3371" path="/var/lib/kubelet/pods/78eea523-e8ee-4f41-93b2-6bbfdcdf3371/volumes" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.974942 5004 generic.go:334] "Generic (PLEG): container finished" podID="4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6" 
containerID="a954bb7f77a6ab7aa41613588fbb7ed72ae298cbc460a278c8c3abbf445aed1e" exitCode=0 Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.975017 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerDied","Data":"a954bb7f77a6ab7aa41613588fbb7ed72ae298cbc460a278c8c3abbf445aed1e"} Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.975071 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"e6a10e0c5a276e6f1f48065f861075274f8317753ea33c26372ca74b9d8253da"} Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.976710 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6kp7_ff08cd56-3e47-4cd7-98ad-8571f178dc62/kube-multus/2.log" Dec 03 14:19:01 crc kubenswrapper[5004]: I1203 14:19:01.976758 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6kp7" event={"ID":"ff08cd56-3e47-4cd7-98ad-8571f178dc62","Type":"ContainerStarted","Data":"90b1810c59293e0e4fd5e799ff10d6746d1f9aaa6372a83f67a811f54fead16e"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.993536 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"25cd17fdace5d8c29b733c7a58fef909a44c271ff8a9a1971f555fa49102005a"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.994180 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"4fe5b4430f4911ea0183cd6740cda44c14757cef364c0b5608ee6d8a5ac8734c"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.994196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"9bafe270a641a79070d16f10308556d24cb845423f790460193961aaa6c60e82"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.994210 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"5a3dba2190b223d0e54838ab43a46a7685f82f4e941916f09ca6232fc8971a6b"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.994228 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"56eba512bc6c54e0e29001b4f45d7e3b09f134f3958accffc4c4ddef94951345"} Dec 03 14:19:02 crc kubenswrapper[5004]: I1203 14:19:02.994240 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"9966bef898c242a0276bea88bbac395f716eb70d96b5fdc27f5c8b8dda80d27a"} Dec 03 14:19:05 crc kubenswrapper[5004]: I1203 14:19:05.006630 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"9dd161d65b573fbbe4053cf141c3e50ebc96beac3fc74354dcab8fad23856783"} Dec 03 14:19:08 crc kubenswrapper[5004]: I1203 14:19:08.027722 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" event={"ID":"4e4b906d-ffe9-4bd3-b82c-b5eeb3ce96b6","Type":"ContainerStarted","Data":"9762ca1f21b79b441a2843865048f69654c45226bb719ff8696b4a7b4837fa6c"} Dec 03 14:19:09 crc kubenswrapper[5004]: I1203 14:19:09.032053 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 
14:19:09 crc kubenswrapper[5004]: I1203 14:19:09.032414 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:09 crc kubenswrapper[5004]: I1203 14:19:09.071916 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" podStartSLOduration=9.071899054 podStartE2EDuration="9.071899054s" podCreationTimestamp="2025-12-03 14:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:19:09.07037831 +0000 UTC m=+761.819348546" watchObservedRunningTime="2025-12-03 14:19:09.071899054 +0000 UTC m=+761.820869290" Dec 03 14:19:09 crc kubenswrapper[5004]: I1203 14:19:09.073075 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:10 crc kubenswrapper[5004]: I1203 14:19:10.047027 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:10 crc kubenswrapper[5004]: I1203 14:19:10.076291 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:12 crc kubenswrapper[5004]: I1203 14:19:12.090184 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7fqgh" Dec 03 14:19:22 crc kubenswrapper[5004]: I1203 14:19:22.823956 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:19:22 crc kubenswrapper[5004]: I1203 14:19:22.824553 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.473094 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm"] Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.474520 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.476091 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.481082 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm"] Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.501215 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.501268 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.501339 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxbvs\" (UniqueName: \"kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.602976 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.603100 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.603239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxbvs\" (UniqueName: \"kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 
14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.603544 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.603732 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.636679 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxbvs\" (UniqueName: \"kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:39 crc kubenswrapper[5004]: I1203 14:19:39.789032 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:40 crc kubenswrapper[5004]: I1203 14:19:40.024703 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm"] Dec 03 14:19:40 crc kubenswrapper[5004]: I1203 14:19:40.211355 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerStarted","Data":"8640c2b8b74d89ad880f98175496431bfcd64138f551378830fb8c2bed867394"} Dec 03 14:19:40 crc kubenswrapper[5004]: I1203 14:19:40.211664 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerStarted","Data":"873d8dcaa60e1cc0b8b384b6c9f925480febb0fd79938931c69f4feebed08408"} Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.147242 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.148729 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.151235 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.220010 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrvv\" (UniqueName: \"kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.220090 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.220142 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.220770 5004 generic.go:334] "Generic (PLEG): container finished" podID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerID="8640c2b8b74d89ad880f98175496431bfcd64138f551378830fb8c2bed867394" exitCode=0 Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.220819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" 
event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerDied","Data":"8640c2b8b74d89ad880f98175496431bfcd64138f551378830fb8c2bed867394"} Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.321075 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrvv\" (UniqueName: \"kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.321134 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.321164 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.321634 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.321700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities\") pod \"redhat-operators-xn4vg\" (UID: 
\"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.339476 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrvv\" (UniqueName: \"kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv\") pod \"redhat-operators-xn4vg\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.468382 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:41 crc kubenswrapper[5004]: I1203 14:19:41.880414 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:41 crc kubenswrapper[5004]: W1203 14:19:41.887155 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47a9147c_3367_47a4_b8c4_7217c5b2637c.slice/crio-eb80c6b35260963bbac8529f146a0c564acf10f0b9a69fb48ab9af9fc7770eb7 WatchSource:0}: Error finding container eb80c6b35260963bbac8529f146a0c564acf10f0b9a69fb48ab9af9fc7770eb7: Status 404 returned error can't find the container with id eb80c6b35260963bbac8529f146a0c564acf10f0b9a69fb48ab9af9fc7770eb7 Dec 03 14:19:42 crc kubenswrapper[5004]: I1203 14:19:42.226450 5004 generic.go:334] "Generic (PLEG): container finished" podID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerID="7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245" exitCode=0 Dec 03 14:19:42 crc kubenswrapper[5004]: I1203 14:19:42.226724 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerDied","Data":"7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245"} Dec 03 14:19:42 crc 
kubenswrapper[5004]: I1203 14:19:42.226751 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerStarted","Data":"eb80c6b35260963bbac8529f146a0c564acf10f0b9a69fb48ab9af9fc7770eb7"} Dec 03 14:19:44 crc kubenswrapper[5004]: I1203 14:19:44.240377 5004 generic.go:334] "Generic (PLEG): container finished" podID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerID="923a557765a36450b4076d9d85cb65498a9300960895d3fe40665e515249d947" exitCode=0 Dec 03 14:19:44 crc kubenswrapper[5004]: I1203 14:19:44.240481 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerDied","Data":"923a557765a36450b4076d9d85cb65498a9300960895d3fe40665e515249d947"} Dec 03 14:19:45 crc kubenswrapper[5004]: I1203 14:19:45.246937 5004 generic.go:334] "Generic (PLEG): container finished" podID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerID="dd4c4cbc1151b22be1b4a24e6da84e75ee67ab38370559d85bb91745686d2dfb" exitCode=0 Dec 03 14:19:45 crc kubenswrapper[5004]: I1203 14:19:45.247010 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerDied","Data":"dd4c4cbc1151b22be1b4a24e6da84e75ee67ab38370559d85bb91745686d2dfb"} Dec 03 14:19:45 crc kubenswrapper[5004]: I1203 14:19:45.248798 5004 generic.go:334] "Generic (PLEG): container finished" podID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerID="65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc" exitCode=0 Dec 03 14:19:45 crc kubenswrapper[5004]: I1203 14:19:45.248833 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" 
event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerDied","Data":"65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc"} Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.486379 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.584744 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util\") pod \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.584814 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle\") pod \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.584891 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxbvs\" (UniqueName: \"kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs\") pod \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\" (UID: \"2328dd14-b73c-45d2-9ea7-bfb5c246e262\") " Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.585382 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle" (OuterVolumeSpecName: "bundle") pod "2328dd14-b73c-45d2-9ea7-bfb5c246e262" (UID: "2328dd14-b73c-45d2-9ea7-bfb5c246e262"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.585915 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.590104 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs" (OuterVolumeSpecName: "kube-api-access-dxbvs") pod "2328dd14-b73c-45d2-9ea7-bfb5c246e262" (UID: "2328dd14-b73c-45d2-9ea7-bfb5c246e262"). InnerVolumeSpecName "kube-api-access-dxbvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.595688 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util" (OuterVolumeSpecName: "util") pod "2328dd14-b73c-45d2-9ea7-bfb5c246e262" (UID: "2328dd14-b73c-45d2-9ea7-bfb5c246e262"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.687542 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2328dd14-b73c-45d2-9ea7-bfb5c246e262-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:46 crc kubenswrapper[5004]: I1203 14:19:46.687843 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxbvs\" (UniqueName: \"kubernetes.io/projected/2328dd14-b73c-45d2-9ea7-bfb5c246e262-kube-api-access-dxbvs\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:47 crc kubenswrapper[5004]: I1203 14:19:47.262393 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" event={"ID":"2328dd14-b73c-45d2-9ea7-bfb5c246e262","Type":"ContainerDied","Data":"873d8dcaa60e1cc0b8b384b6c9f925480febb0fd79938931c69f4feebed08408"} Dec 03 14:19:47 crc kubenswrapper[5004]: I1203 14:19:47.262426 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm" Dec 03 14:19:47 crc kubenswrapper[5004]: I1203 14:19:47.262431 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873d8dcaa60e1cc0b8b384b6c9f925480febb0fd79938931c69f4feebed08408" Dec 03 14:19:47 crc kubenswrapper[5004]: I1203 14:19:47.264678 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerStarted","Data":"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af"} Dec 03 14:19:47 crc kubenswrapper[5004]: I1203 14:19:47.280254 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xn4vg" podStartSLOduration=2.327054716 podStartE2EDuration="6.28023272s" podCreationTimestamp="2025-12-03 14:19:41 +0000 UTC" firstStartedPulling="2025-12-03 14:19:42.22798742 +0000 UTC m=+794.976957656" lastFinishedPulling="2025-12-03 14:19:46.181165414 +0000 UTC m=+798.930135660" observedRunningTime="2025-12-03 14:19:47.278951203 +0000 UTC m=+800.027921429" watchObservedRunningTime="2025-12-03 14:19:47.28023272 +0000 UTC m=+800.029202956" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.223554 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k"] Dec 03 14:19:50 crc kubenswrapper[5004]: E1203 14:19:50.224129 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="extract" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.224146 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="extract" Dec 03 14:19:50 crc kubenswrapper[5004]: E1203 14:19:50.224166 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="pull" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.224174 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="pull" Dec 03 14:19:50 crc kubenswrapper[5004]: E1203 14:19:50.224193 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="util" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.224201 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="util" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.224316 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2328dd14-b73c-45d2-9ea7-bfb5c246e262" containerName="extract" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.224761 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.227121 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.227152 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.227529 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zx5kp" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.230498 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5g6r\" (UniqueName: \"kubernetes.io/projected/5a1faf91-6fda-4e62-801d-bb1624d95274-kube-api-access-s5g6r\") pod \"nmstate-operator-5b5b58f5c8-pf52k\" (UID: \"5a1faf91-6fda-4e62-801d-bb1624d95274\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" Dec 03 
14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.244658 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k"] Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.331546 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5g6r\" (UniqueName: \"kubernetes.io/projected/5a1faf91-6fda-4e62-801d-bb1624d95274-kube-api-access-s5g6r\") pod \"nmstate-operator-5b5b58f5c8-pf52k\" (UID: \"5a1faf91-6fda-4e62-801d-bb1624d95274\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.363715 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5g6r\" (UniqueName: \"kubernetes.io/projected/5a1faf91-6fda-4e62-801d-bb1624d95274-kube-api-access-s5g6r\") pod \"nmstate-operator-5b5b58f5c8-pf52k\" (UID: \"5a1faf91-6fda-4e62-801d-bb1624d95274\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.540909 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" Dec 03 14:19:50 crc kubenswrapper[5004]: I1203 14:19:50.766974 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k"] Dec 03 14:19:50 crc kubenswrapper[5004]: W1203 14:19:50.775124 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a1faf91_6fda_4e62_801d_bb1624d95274.slice/crio-567cb297853a6a8afd543b6f5aeeb10b2826d54430aae74f4c7aaadf41b4c5ea WatchSource:0}: Error finding container 567cb297853a6a8afd543b6f5aeeb10b2826d54430aae74f4c7aaadf41b4c5ea: Status 404 returned error can't find the container with id 567cb297853a6a8afd543b6f5aeeb10b2826d54430aae74f4c7aaadf41b4c5ea Dec 03 14:19:51 crc kubenswrapper[5004]: I1203 14:19:51.289582 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" event={"ID":"5a1faf91-6fda-4e62-801d-bb1624d95274","Type":"ContainerStarted","Data":"567cb297853a6a8afd543b6f5aeeb10b2826d54430aae74f4c7aaadf41b4c5ea"} Dec 03 14:19:51 crc kubenswrapper[5004]: I1203 14:19:51.470163 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:51 crc kubenswrapper[5004]: I1203 14:19:51.470229 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:51 crc kubenswrapper[5004]: I1203 14:19:51.508694 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:52 crc kubenswrapper[5004]: I1203 14:19:52.345008 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:52 crc kubenswrapper[5004]: I1203 14:19:52.824243 5004 patch_prober.go:28] interesting 
pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:19:52 crc kubenswrapper[5004]: I1203 14:19:52.824329 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:19:53 crc kubenswrapper[5004]: I1203 14:19:53.933907 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.308022 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" event={"ID":"5a1faf91-6fda-4e62-801d-bb1624d95274","Type":"ContainerStarted","Data":"fd6e84292838646ff88c092497eee13a0fbb9e8f1edfcb5fd4410a1e57c5dcd4"} Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.308397 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xn4vg" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="registry-server" containerID="cri-o://c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af" gracePeriod=2 Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.330668 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pf52k" podStartSLOduration=1.208677651 podStartE2EDuration="4.330647823s" podCreationTimestamp="2025-12-03 14:19:50 +0000 UTC" firstStartedPulling="2025-12-03 14:19:50.77652895 +0000 UTC m=+803.525499186" lastFinishedPulling="2025-12-03 14:19:53.898499122 +0000 UTC m=+806.647469358" 
observedRunningTime="2025-12-03 14:19:54.330108707 +0000 UTC m=+807.079078973" watchObservedRunningTime="2025-12-03 14:19:54.330647823 +0000 UTC m=+807.079618059" Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.694931 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.783664 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities\") pod \"47a9147c-3367-47a4-b8c4-7217c5b2637c\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.783730 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmrvv\" (UniqueName: \"kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv\") pod \"47a9147c-3367-47a4-b8c4-7217c5b2637c\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.783766 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content\") pod \"47a9147c-3367-47a4-b8c4-7217c5b2637c\" (UID: \"47a9147c-3367-47a4-b8c4-7217c5b2637c\") " Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.784717 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities" (OuterVolumeSpecName: "utilities") pod "47a9147c-3367-47a4-b8c4-7217c5b2637c" (UID: "47a9147c-3367-47a4-b8c4-7217c5b2637c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.789419 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv" (OuterVolumeSpecName: "kube-api-access-jmrvv") pod "47a9147c-3367-47a4-b8c4-7217c5b2637c" (UID: "47a9147c-3367-47a4-b8c4-7217c5b2637c"). InnerVolumeSpecName "kube-api-access-jmrvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.885393 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmrvv\" (UniqueName: \"kubernetes.io/projected/47a9147c-3367-47a4-b8c4-7217c5b2637c-kube-api-access-jmrvv\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:54 crc kubenswrapper[5004]: I1203 14:19:54.885458 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.314495 5004 generic.go:334] "Generic (PLEG): container finished" podID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerID="c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af" exitCode=0 Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.314545 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerDied","Data":"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af"} Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.314565 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xn4vg" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.314599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xn4vg" event={"ID":"47a9147c-3367-47a4-b8c4-7217c5b2637c","Type":"ContainerDied","Data":"eb80c6b35260963bbac8529f146a0c564acf10f0b9a69fb48ab9af9fc7770eb7"} Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.314615 5004 scope.go:117] "RemoveContainer" containerID="c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.329809 5004 scope.go:117] "RemoveContainer" containerID="65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.345704 5004 scope.go:117] "RemoveContainer" containerID="7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.357901 5004 scope.go:117] "RemoveContainer" containerID="c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af" Dec 03 14:19:55 crc kubenswrapper[5004]: E1203 14:19:55.358251 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af\": container with ID starting with c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af not found: ID does not exist" containerID="c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.358283 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af"} err="failed to get container status \"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af\": rpc error: code = NotFound desc = could not find container 
\"c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af\": container with ID starting with c283f60fd0f344b2776b5d2d84f758180b0101c130f1ee2b56f0f144528926af not found: ID does not exist" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.358304 5004 scope.go:117] "RemoveContainer" containerID="65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc" Dec 03 14:19:55 crc kubenswrapper[5004]: E1203 14:19:55.358919 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc\": container with ID starting with 65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc not found: ID does not exist" containerID="65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.358952 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc"} err="failed to get container status \"65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc\": rpc error: code = NotFound desc = could not find container \"65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc\": container with ID starting with 65964ba6b8e2ebf2730662371cffc5d7aab1476b20aaa602a0c57c1d6daf5ffc not found: ID does not exist" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.358972 5004 scope.go:117] "RemoveContainer" containerID="7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245" Dec 03 14:19:55 crc kubenswrapper[5004]: E1203 14:19:55.359430 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245\": container with ID starting with 7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245 not found: ID does not exist" 
containerID="7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245" Dec 03 14:19:55 crc kubenswrapper[5004]: I1203 14:19:55.359491 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245"} err="failed to get container status \"7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245\": rpc error: code = NotFound desc = could not find container \"7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245\": container with ID starting with 7d17565b341831a195b0159140f0522fa91a201e7c1bc90ec9aaa4e049bc4245 not found: ID does not exist" Dec 03 14:19:56 crc kubenswrapper[5004]: I1203 14:19:56.647339 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47a9147c-3367-47a4-b8c4-7217c5b2637c" (UID: "47a9147c-3367-47a4-b8c4-7217c5b2637c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:19:56 crc kubenswrapper[5004]: I1203 14:19:56.706030 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a9147c-3367-47a4-b8c4-7217c5b2637c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:19:56 crc kubenswrapper[5004]: I1203 14:19:56.845277 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:56 crc kubenswrapper[5004]: I1203 14:19:56.851102 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xn4vg"] Dec 03 14:19:57 crc kubenswrapper[5004]: I1203 14:19:57.620244 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" path="/var/lib/kubelet/pods/47a9147c-3367-47a4-b8c4-7217c5b2637c/volumes" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.825456 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn"] Dec 03 14:19:59 crc kubenswrapper[5004]: E1203 14:19:59.825745 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="registry-server" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.825765 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="registry-server" Dec 03 14:19:59 crc kubenswrapper[5004]: E1203 14:19:59.825786 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="extract-utilities" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.825795 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="extract-utilities" Dec 03 14:19:59 crc kubenswrapper[5004]: E1203 14:19:59.825806 5004 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="extract-content" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.825816 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="extract-content" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.825956 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a9147c-3367-47a4-b8c4-7217c5b2637c" containerName="registry-server" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.826601 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.830034 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fhx64" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.837287 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn"] Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.847428 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5"] Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.848134 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.860853 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.872633 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5"] Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.877906 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hdbk9"] Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.879001 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.951679 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght2h\" (UniqueName: \"kubernetes.io/projected/c005e57a-6449-4c48-a81c-deda46fc3d02-kube-api-access-ght2h\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.951739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fwz2\" (UniqueName: \"kubernetes.io/projected/4efdf8bb-b98f-4afa-a605-0bb57c93b999-kube-api-access-5fwz2\") pod \"nmstate-metrics-7f946cbc9-mvxhn\" (UID: \"4efdf8bb-b98f-4afa-a605-0bb57c93b999\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.951794 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: 
\"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.966879 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx"] Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.967932 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.971502 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.971615 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wqsss" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.971736 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 14:19:59 crc kubenswrapper[5004]: I1203 14:19:59.977673 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx"] Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.053276 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght2h\" (UniqueName: \"kubernetes.io/projected/c005e57a-6449-4c48-a81c-deda46fc3d02-kube-api-access-ght2h\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.053334 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fwz2\" (UniqueName: \"kubernetes.io/projected/4efdf8bb-b98f-4afa-a605-0bb57c93b999-kube-api-access-5fwz2\") pod \"nmstate-metrics-7f946cbc9-mvxhn\" (UID: \"4efdf8bb-b98f-4afa-a605-0bb57c93b999\") " 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.053556 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.053629 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-nmstate-lock\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: E1203 14:20:00.053775 5004 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.053778 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57fh\" (UniqueName: \"kubernetes.io/projected/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-kube-api-access-x57fh\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: E1203 14:20:00.054006 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair podName:c005e57a-6449-4c48-a81c-deda46fc3d02 nodeName:}" failed. No retries permitted until 2025-12-03 14:20:00.553902525 +0000 UTC m=+813.302872761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-xbgh5" (UID: "c005e57a-6449-4c48-a81c-deda46fc3d02") : secret "openshift-nmstate-webhook" not found Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.054104 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-dbus-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.054214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-ovs-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.072796 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fwz2\" (UniqueName: \"kubernetes.io/projected/4efdf8bb-b98f-4afa-a605-0bb57c93b999-kube-api-access-5fwz2\") pod \"nmstate-metrics-7f946cbc9-mvxhn\" (UID: \"4efdf8bb-b98f-4afa-a605-0bb57c93b999\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.086676 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght2h\" (UniqueName: \"kubernetes.io/projected/c005e57a-6449-4c48-a81c-deda46fc3d02-kube-api-access-ght2h\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.143666 5004 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154582 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-dbus-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154629 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-ovs-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154685 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-nmstate-lock\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154710 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3279d4-50fa-454f-993b-ce1d1aa33140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154740 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c3279d4-50fa-454f-993b-ce1d1aa33140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: 
\"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154763 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jjb\" (UniqueName: \"kubernetes.io/projected/3c3279d4-50fa-454f-993b-ce1d1aa33140-kube-api-access-v2jjb\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154782 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57fh\" (UniqueName: \"kubernetes.io/projected/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-kube-api-access-x57fh\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154835 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-ovs-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-dbus-socket\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.154999 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-nmstate-lock\") pod \"nmstate-handler-hdbk9\" (UID: 
\"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.159968 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bfc754d6-sfd87"] Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.160779 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.175363 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bfc754d6-sfd87"] Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.208578 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57fh\" (UniqueName: \"kubernetes.io/projected/ba4eef2f-7208-44fd-b116-6f394cf2c7e2-kube-api-access-x57fh\") pod \"nmstate-handler-hdbk9\" (UID: \"ba4eef2f-7208-44fd-b116-6f394cf2c7e2\") " pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.255798 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3279d4-50fa-454f-993b-ce1d1aa33140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.256008 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c3279d4-50fa-454f-993b-ce1d1aa33140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.256053 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v2jjb\" (UniqueName: \"kubernetes.io/projected/3c3279d4-50fa-454f-993b-ce1d1aa33140-kube-api-access-v2jjb\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.256944 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3c3279d4-50fa-454f-993b-ce1d1aa33140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.261792 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3279d4-50fa-454f-993b-ce1d1aa33140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.273685 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jjb\" (UniqueName: \"kubernetes.io/projected/3c3279d4-50fa-454f-993b-ce1d1aa33140-kube-api-access-v2jjb\") pod \"nmstate-console-plugin-7fbb5f6569-pgnkx\" (UID: \"3c3279d4-50fa-454f-993b-ce1d1aa33140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.291036 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357562 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-trusted-ca-bundle\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357633 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-oauth-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357654 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-oauth-serving-cert\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357676 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-serving-cert\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357702 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-service-ca\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357734 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qj8\" (UniqueName: \"kubernetes.io/projected/be8275af-b224-4ead-a7d0-9287bbe5da57-kube-api-access-g2qj8\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.357767 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-console-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.370546 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn"] Dec 03 14:20:00 crc kubenswrapper[5004]: W1203 14:20:00.378224 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4efdf8bb_b98f_4afa_a605_0bb57c93b999.slice/crio-3290a426d13a726890f2a40860664bf5371f4873468e26d8769a9660ddffba62 WatchSource:0}: Error finding container 3290a426d13a726890f2a40860664bf5371f4873468e26d8769a9660ddffba62: Status 404 returned error can't find the container with id 3290a426d13a726890f2a40860664bf5371f4873468e26d8769a9660ddffba62 Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458550 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-service-ca\") pod 
\"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458645 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qj8\" (UniqueName: \"kubernetes.io/projected/be8275af-b224-4ead-a7d0-9287bbe5da57-kube-api-access-g2qj8\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458697 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-console-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458745 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-trusted-ca-bundle\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458805 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-oauth-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458833 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-oauth-serving-cert\") pod \"console-5bfc754d6-sfd87\" 
(UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.458881 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-serving-cert\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.459949 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-console-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.461046 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-trusted-ca-bundle\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.461080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-service-ca\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.461506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be8275af-b224-4ead-a7d0-9287bbe5da57-oauth-serving-cert\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " 
pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.463775 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-serving-cert\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.464225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be8275af-b224-4ead-a7d0-9287bbe5da57-console-oauth-config\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.479395 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qj8\" (UniqueName: \"kubernetes.io/projected/be8275af-b224-4ead-a7d0-9287bbe5da57-kube-api-access-g2qj8\") pod \"console-5bfc754d6-sfd87\" (UID: \"be8275af-b224-4ead-a7d0-9287bbe5da57\") " pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.482192 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx"] Dec 03 14:20:00 crc kubenswrapper[5004]: W1203 14:20:00.490763 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3279d4_50fa_454f_993b_ce1d1aa33140.slice/crio-75badb755a5a9ebd833ff4597df3014da6dff4026ca610a64af2bf9c1da776c6 WatchSource:0}: Error finding container 75badb755a5a9ebd833ff4597df3014da6dff4026ca610a64af2bf9c1da776c6: Status 404 returned error can't find the container with id 75badb755a5a9ebd833ff4597df3014da6dff4026ca610a64af2bf9c1da776c6 Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 
14:20:00.497151 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:00 crc kubenswrapper[5004]: W1203 14:20:00.514220 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4eef2f_7208_44fd_b116_6f394cf2c7e2.slice/crio-e8398a19730c4b047ba0c4d23d1f3045fa13636fe57043317069f65e0f22573b WatchSource:0}: Error finding container e8398a19730c4b047ba0c4d23d1f3045fa13636fe57043317069f65e0f22573b: Status 404 returned error can't find the container with id e8398a19730c4b047ba0c4d23d1f3045fa13636fe57043317069f65e0f22573b Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.536991 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.559690 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.562752 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c005e57a-6449-4c48-a81c-deda46fc3d02-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-xbgh5\" (UID: \"c005e57a-6449-4c48-a81c-deda46fc3d02\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.712508 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bfc754d6-sfd87"] Dec 03 14:20:00 crc kubenswrapper[5004]: W1203 14:20:00.718064 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8275af_b224_4ead_a7d0_9287bbe5da57.slice/crio-3a5fdbc87cff6b44b2e652b92b76fdb07ee7c8b541a897f5d9e2b276e13cf0f4 WatchSource:0}: Error finding container 3a5fdbc87cff6b44b2e652b92b76fdb07ee7c8b541a897f5d9e2b276e13cf0f4: Status 404 returned error can't find the container with id 3a5fdbc87cff6b44b2e652b92b76fdb07ee7c8b541a897f5d9e2b276e13cf0f4 Dec 03 14:20:00 crc kubenswrapper[5004]: I1203 14:20:00.762459 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:01 crc kubenswrapper[5004]: I1203 14:20:01.139199 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5"] Dec 03 14:20:01 crc kubenswrapper[5004]: W1203 14:20:01.146662 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc005e57a_6449_4c48_a81c_deda46fc3d02.slice/crio-d5bd66fc111a84b52b6f8f76f2e6f3dface8924cb68f037d4376978b0e24ee30 WatchSource:0}: Error finding container d5bd66fc111a84b52b6f8f76f2e6f3dface8924cb68f037d4376978b0e24ee30: Status 404 returned error can't find the container with id d5bd66fc111a84b52b6f8f76f2e6f3dface8924cb68f037d4376978b0e24ee30 Dec 03 14:20:01 crc kubenswrapper[5004]: I1203 14:20:01.355662 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hdbk9" event={"ID":"ba4eef2f-7208-44fd-b116-6f394cf2c7e2","Type":"ContainerStarted","Data":"e8398a19730c4b047ba0c4d23d1f3045fa13636fe57043317069f65e0f22573b"} Dec 03 14:20:01 crc kubenswrapper[5004]: I1203 14:20:01.357370 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" event={"ID":"4efdf8bb-b98f-4afa-a605-0bb57c93b999","Type":"ContainerStarted","Data":"3290a426d13a726890f2a40860664bf5371f4873468e26d8769a9660ddffba62"} Dec 03 14:20:01 crc 
kubenswrapper[5004]: I1203 14:20:01.358453 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bfc754d6-sfd87" event={"ID":"be8275af-b224-4ead-a7d0-9287bbe5da57","Type":"ContainerStarted","Data":"3a5fdbc87cff6b44b2e652b92b76fdb07ee7c8b541a897f5d9e2b276e13cf0f4"} Dec 03 14:20:01 crc kubenswrapper[5004]: I1203 14:20:01.359533 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" event={"ID":"c005e57a-6449-4c48-a81c-deda46fc3d02","Type":"ContainerStarted","Data":"d5bd66fc111a84b52b6f8f76f2e6f3dface8924cb68f037d4376978b0e24ee30"} Dec 03 14:20:01 crc kubenswrapper[5004]: I1203 14:20:01.360630 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" event={"ID":"3c3279d4-50fa-454f-993b-ce1d1aa33140","Type":"ContainerStarted","Data":"75badb755a5a9ebd833ff4597df3014da6dff4026ca610a64af2bf9c1da776c6"} Dec 03 14:20:03 crc kubenswrapper[5004]: I1203 14:20:03.375328 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bfc754d6-sfd87" event={"ID":"be8275af-b224-4ead-a7d0-9287bbe5da57","Type":"ContainerStarted","Data":"a450a974e0085efdde00ae55ba44bd240358f610daf3609042ed154326a73f99"} Dec 03 14:20:03 crc kubenswrapper[5004]: I1203 14:20:03.397263 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bfc754d6-sfd87" podStartSLOduration=3.397243842 podStartE2EDuration="3.397243842s" podCreationTimestamp="2025-12-03 14:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:20:03.396970324 +0000 UTC m=+816.145940580" watchObservedRunningTime="2025-12-03 14:20:03.397243842 +0000 UTC m=+816.146214078" Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.403208 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" event={"ID":"c005e57a-6449-4c48-a81c-deda46fc3d02","Type":"ContainerStarted","Data":"cddb2c1e62c868ee490c361d156e2410f359ed58330be260aa29f6493e35a1c2"} Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.403707 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.405128 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" event={"ID":"3c3279d4-50fa-454f-993b-ce1d1aa33140","Type":"ContainerStarted","Data":"9eda1f6b942e5304897e04f35fbff9714c0438c2a4c019be8950cb8feecb0da6"} Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.408101 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hdbk9" event={"ID":"ba4eef2f-7208-44fd-b116-6f394cf2c7e2","Type":"ContainerStarted","Data":"b4b9434df2d816d8a192a6d4244b8950151b16869529d401d95a42d28c0766c1"} Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.408274 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.410298 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" event={"ID":"4efdf8bb-b98f-4afa-a605-0bb57c93b999","Type":"ContainerStarted","Data":"1009ddbc154d51c76083c25436330634071c1bfc371553e29ed2f972733300b5"} Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.434497 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" podStartSLOduration=2.657756663 podStartE2EDuration="8.434474623s" podCreationTimestamp="2025-12-03 14:19:59 +0000 UTC" firstStartedPulling="2025-12-03 14:20:01.148694188 +0000 UTC m=+813.897664434" lastFinishedPulling="2025-12-03 14:20:06.925412148 
+0000 UTC m=+819.674382394" observedRunningTime="2025-12-03 14:20:07.425352214 +0000 UTC m=+820.174322450" watchObservedRunningTime="2025-12-03 14:20:07.434474623 +0000 UTC m=+820.183444859" Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.450982 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-pgnkx" podStartSLOduration=2.013577783 podStartE2EDuration="8.450965491s" podCreationTimestamp="2025-12-03 14:19:59 +0000 UTC" firstStartedPulling="2025-12-03 14:20:00.493000182 +0000 UTC m=+813.241970418" lastFinishedPulling="2025-12-03 14:20:06.93038789 +0000 UTC m=+819.679358126" observedRunningTime="2025-12-03 14:20:07.447214624 +0000 UTC m=+820.196184870" watchObservedRunningTime="2025-12-03 14:20:07.450965491 +0000 UTC m=+820.199935727" Dec 03 14:20:07 crc kubenswrapper[5004]: I1203 14:20:07.634298 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hdbk9" podStartSLOduration=2.186930465 podStartE2EDuration="8.634278956s" podCreationTimestamp="2025-12-03 14:19:59 +0000 UTC" firstStartedPulling="2025-12-03 14:20:00.522703325 +0000 UTC m=+813.271673561" lastFinishedPulling="2025-12-03 14:20:06.970051816 +0000 UTC m=+819.719022052" observedRunningTime="2025-12-03 14:20:07.469802846 +0000 UTC m=+820.218773082" watchObservedRunningTime="2025-12-03 14:20:07.634278956 +0000 UTC m=+820.383249192" Dec 03 14:20:10 crc kubenswrapper[5004]: I1203 14:20:10.427430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" event={"ID":"4efdf8bb-b98f-4afa-a605-0bb57c93b999","Type":"ContainerStarted","Data":"e21d3033d5c1915b6d041f9e5ead0f0b7acdb041e67534825cc44edb56cea5d3"} Dec 03 14:20:10 crc kubenswrapper[5004]: I1203 14:20:10.447316 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mvxhn" 
podStartSLOduration=2.536411199 podStartE2EDuration="11.447293086s" podCreationTimestamp="2025-12-03 14:19:59 +0000 UTC" firstStartedPulling="2025-12-03 14:20:00.380640782 +0000 UTC m=+813.129611018" lastFinishedPulling="2025-12-03 14:20:09.291522669 +0000 UTC m=+822.040492905" observedRunningTime="2025-12-03 14:20:10.44356048 +0000 UTC m=+823.192530736" watchObservedRunningTime="2025-12-03 14:20:10.447293086 +0000 UTC m=+823.196263322" Dec 03 14:20:10 crc kubenswrapper[5004]: I1203 14:20:10.537901 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:10 crc kubenswrapper[5004]: I1203 14:20:10.537971 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:10 crc kubenswrapper[5004]: I1203 14:20:10.542293 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:11 crc kubenswrapper[5004]: I1203 14:20:11.435586 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bfc754d6-sfd87" Dec 03 14:20:11 crc kubenswrapper[5004]: I1203 14:20:11.502752 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"] Dec 03 14:20:15 crc kubenswrapper[5004]: I1203 14:20:15.518841 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hdbk9" Dec 03 14:20:20 crc kubenswrapper[5004]: I1203 14:20:20.768753 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-xbgh5" Dec 03 14:20:22 crc kubenswrapper[5004]: I1203 14:20:22.824361 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:20:22 crc kubenswrapper[5004]: I1203 14:20:22.824455 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:20:22 crc kubenswrapper[5004]: I1203 14:20:22.824498 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:20:22 crc kubenswrapper[5004]: I1203 14:20:22.825121 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:20:22 crc kubenswrapper[5004]: I1203 14:20:22.825180 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21" gracePeriod=600 Dec 03 14:20:23 crc kubenswrapper[5004]: I1203 14:20:23.513840 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21" exitCode=0 Dec 03 14:20:23 crc kubenswrapper[5004]: I1203 14:20:23.513894 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21"} Dec 03 14:20:23 crc kubenswrapper[5004]: I1203 14:20:23.514422 5004 scope.go:117] "RemoveContainer" containerID="dcc8ec2ea98d9066af5330ce691b8ab9b42962a34ae3477cfb70c6b1c098fb85" Dec 03 14:20:24 crc kubenswrapper[5004]: I1203 14:20:24.523617 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6"} Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.578468 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr"] Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.581188 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.584238 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.588976 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr"] Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.669603 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszjn\" (UniqueName: \"kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" 
Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.669950 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.670076 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.771995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.772082 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszjn\" (UniqueName: \"kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.772129 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.772695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.772945 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.801930 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszjn\" (UniqueName: \"kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:34 crc kubenswrapper[5004]: I1203 14:20:34.903139 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:35 crc kubenswrapper[5004]: I1203 14:20:35.128556 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr"] Dec 03 14:20:35 crc kubenswrapper[5004]: I1203 14:20:35.586989 5004 generic.go:334] "Generic (PLEG): container finished" podID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerID="b7260702693fd8f8cd9dbbda53f627ad46cc925f4863f8a4d7f1212f002bbc1f" exitCode=0 Dec 03 14:20:35 crc kubenswrapper[5004]: I1203 14:20:35.587037 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" event={"ID":"e3c17e2f-5008-49fa-9e86-3d63c506af53","Type":"ContainerDied","Data":"b7260702693fd8f8cd9dbbda53f627ad46cc925f4863f8a4d7f1212f002bbc1f"} Dec 03 14:20:35 crc kubenswrapper[5004]: I1203 14:20:35.587082 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" event={"ID":"e3c17e2f-5008-49fa-9e86-3d63c506af53","Type":"ContainerStarted","Data":"5298e352ad3d65830bb7bb8c7048a6dbf1f33c0235c29325187700bf235d780c"} Dec 03 14:20:36 crc kubenswrapper[5004]: I1203 14:20:36.549078 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ll8wz" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" containerName="console" containerID="cri-o://dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6" gracePeriod=15 Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.427726 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ll8wz_8eede088-bf0c-48cb-b158-d58aa0c58eb0/console/0.log" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.428200 5004 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510364 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510417 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510450 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510492 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510548 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-png7j\" (UniqueName: \"kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.510582 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config\") pod \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\" (UID: \"8eede088-bf0c-48cb-b158-d58aa0c58eb0\") " Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.511425 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config" (OuterVolumeSpecName: "console-config") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.513741 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca" (OuterVolumeSpecName: "service-ca") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.513898 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.514500 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.518894 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.519198 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j" (OuterVolumeSpecName: "kube-api-access-png7j") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "kube-api-access-png7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.519546 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8eede088-bf0c-48cb-b158-d58aa0c58eb0" (UID: "8eede088-bf0c-48cb-b158-d58aa0c58eb0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.601923 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ll8wz_8eede088-bf0c-48cb-b158-d58aa0c58eb0/console/0.log" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.601967 5004 generic.go:334] "Generic (PLEG): container finished" podID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" containerID="dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6" exitCode=2 Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.601995 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ll8wz" event={"ID":"8eede088-bf0c-48cb-b158-d58aa0c58eb0","Type":"ContainerDied","Data":"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6"} Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.602024 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ll8wz" event={"ID":"8eede088-bf0c-48cb-b158-d58aa0c58eb0","Type":"ContainerDied","Data":"8635654d975694c22781cbfab2f1363c8a2de8b8541b3f81fb91de7bc8c4e7cf"} Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.602040 5004 scope.go:117] "RemoveContainer" containerID="dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.602140 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ll8wz" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612452 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612497 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612510 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612520 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612532 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612541 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-png7j\" (UniqueName: \"kubernetes.io/projected/8eede088-bf0c-48cb-b158-d58aa0c58eb0-kube-api-access-png7j\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.612553 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8eede088-bf0c-48cb-b158-d58aa0c58eb0-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 
14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.633486 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"] Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.637262 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ll8wz"] Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.642742 5004 scope.go:117] "RemoveContainer" containerID="dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6" Dec 03 14:20:37 crc kubenswrapper[5004]: E1203 14:20:37.643482 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6\": container with ID starting with dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6 not found: ID does not exist" containerID="dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6" Dec 03 14:20:37 crc kubenswrapper[5004]: I1203 14:20:37.643575 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6"} err="failed to get container status \"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6\": rpc error: code = NotFound desc = could not find container \"dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6\": container with ID starting with dad53a2c47f4fb15d73e0ef59333cdb7b22382181e321adc9c4a295c9109fce6 not found: ID does not exist" Dec 03 14:20:38 crc kubenswrapper[5004]: I1203 14:20:38.611255 5004 generic.go:334] "Generic (PLEG): container finished" podID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerID="6f8c12a2154c29a3150132960e4ae93c6d702e5101a105718da3bcee63e2b28e" exitCode=0 Dec 03 14:20:38 crc kubenswrapper[5004]: I1203 14:20:38.611365 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" event={"ID":"e3c17e2f-5008-49fa-9e86-3d63c506af53","Type":"ContainerDied","Data":"6f8c12a2154c29a3150132960e4ae93c6d702e5101a105718da3bcee63e2b28e"} Dec 03 14:20:39 crc kubenswrapper[5004]: I1203 14:20:39.621350 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" path="/var/lib/kubelet/pods/8eede088-bf0c-48cb-b158-d58aa0c58eb0/volumes" Dec 03 14:20:39 crc kubenswrapper[5004]: I1203 14:20:39.622461 5004 generic.go:334] "Generic (PLEG): container finished" podID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerID="a42237d72471b5eaa9ca0bebed32be033556d954d640e4f3da85133a66120e42" exitCode=0 Dec 03 14:20:39 crc kubenswrapper[5004]: I1203 14:20:39.623182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" event={"ID":"e3c17e2f-5008-49fa-9e86-3d63c506af53","Type":"ContainerDied","Data":"a42237d72471b5eaa9ca0bebed32be033556d954d640e4f3da85133a66120e42"} Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.832314 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.955808 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle\") pod \"e3c17e2f-5008-49fa-9e86-3d63c506af53\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.955932 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util\") pod \"e3c17e2f-5008-49fa-9e86-3d63c506af53\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.955995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mszjn\" (UniqueName: \"kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn\") pod \"e3c17e2f-5008-49fa-9e86-3d63c506af53\" (UID: \"e3c17e2f-5008-49fa-9e86-3d63c506af53\") " Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.956966 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle" (OuterVolumeSpecName: "bundle") pod "e3c17e2f-5008-49fa-9e86-3d63c506af53" (UID: "e3c17e2f-5008-49fa-9e86-3d63c506af53"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:20:40 crc kubenswrapper[5004]: I1203 14:20:40.962036 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn" (OuterVolumeSpecName: "kube-api-access-mszjn") pod "e3c17e2f-5008-49fa-9e86-3d63c506af53" (UID: "e3c17e2f-5008-49fa-9e86-3d63c506af53"). InnerVolumeSpecName "kube-api-access-mszjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.057008 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.057048 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mszjn\" (UniqueName: \"kubernetes.io/projected/e3c17e2f-5008-49fa-9e86-3d63c506af53-kube-api-access-mszjn\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.112168 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util" (OuterVolumeSpecName: "util") pod "e3c17e2f-5008-49fa-9e86-3d63c506af53" (UID: "e3c17e2f-5008-49fa-9e86-3d63c506af53"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.157776 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3c17e2f-5008-49fa-9e86-3d63c506af53-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.635411 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" event={"ID":"e3c17e2f-5008-49fa-9e86-3d63c506af53","Type":"ContainerDied","Data":"5298e352ad3d65830bb7bb8c7048a6dbf1f33c0235c29325187700bf235d780c"} Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.635452 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5298e352ad3d65830bb7bb8c7048a6dbf1f33c0235c29325187700bf235d780c" Dec 03 14:20:41 crc kubenswrapper[5004]: I1203 14:20:41.635466 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406051 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m"] Dec 03 14:20:51 crc kubenswrapper[5004]: E1203 14:20:51.406663 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" containerName="console" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406675 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" containerName="console" Dec 03 14:20:51 crc kubenswrapper[5004]: E1203 14:20:51.406689 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="pull" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406695 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="pull" Dec 03 14:20:51 crc kubenswrapper[5004]: E1203 14:20:51.406712 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="extract" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406718 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="extract" Dec 03 14:20:51 crc kubenswrapper[5004]: E1203 14:20:51.406727 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="util" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406732 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="util" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406825 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eede088-bf0c-48cb-b158-d58aa0c58eb0" 
containerName="console" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.406841 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c17e2f-5008-49fa-9e86-3d63c506af53" containerName="extract" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.407266 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.410983 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.411219 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.411322 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.411367 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.411399 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-jm52w" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.475453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-webhook-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.475511 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-apiservice-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.475673 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzjc\" (UniqueName: \"kubernetes.io/projected/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-kube-api-access-hpzjc\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.495783 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m"] Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.577005 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzjc\" (UniqueName: \"kubernetes.io/projected/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-kube-api-access-hpzjc\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.577103 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-webhook-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.577143 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-apiservice-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.582891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-apiservice-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.590923 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-webhook-cert\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.599542 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzjc\" (UniqueName: \"kubernetes.io/projected/6aa91ebc-2da0-4d5e-9847-d5f2758e72e5-kube-api-access-hpzjc\") pod \"metallb-operator-controller-manager-b7864bd46-w9w6m\" (UID: \"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5\") " pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.648765 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c7f47999-twv92"] Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.650111 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.653623 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-t7hrz" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.653880 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.654020 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.672739 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c7f47999-twv92"] Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.723363 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.778934 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-apiservice-cert\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.779177 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-webhook-cert\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.779355 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89gbv\" (UniqueName: \"kubernetes.io/projected/8d864da6-31ee-490f-b4e8-568f95a96ff0-kube-api-access-89gbv\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.880266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-webhook-cert\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.880333 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89gbv\" (UniqueName: \"kubernetes.io/projected/8d864da6-31ee-490f-b4e8-568f95a96ff0-kube-api-access-89gbv\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.880427 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-apiservice-cert\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.887216 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-apiservice-cert\") pod 
\"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.887232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d864da6-31ee-490f-b4e8-568f95a96ff0-webhook-cert\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.910570 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89gbv\" (UniqueName: \"kubernetes.io/projected/8d864da6-31ee-490f-b4e8-568f95a96ff0-kube-api-access-89gbv\") pod \"metallb-operator-webhook-server-8c7f47999-twv92\" (UID: \"8d864da6-31ee-490f-b4e8-568f95a96ff0\") " pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:51 crc kubenswrapper[5004]: I1203 14:20:51.964315 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:52 crc kubenswrapper[5004]: I1203 14:20:52.012097 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m"] Dec 03 14:20:52 crc kubenswrapper[5004]: W1203 14:20:52.026956 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa91ebc_2da0_4d5e_9847_d5f2758e72e5.slice/crio-47961e294df8ff698134ae53374c5aab002cde26c49db43c247a4e6b4a15e385 WatchSource:0}: Error finding container 47961e294df8ff698134ae53374c5aab002cde26c49db43c247a4e6b4a15e385: Status 404 returned error can't find the container with id 47961e294df8ff698134ae53374c5aab002cde26c49db43c247a4e6b4a15e385 Dec 03 14:20:52 crc kubenswrapper[5004]: I1203 14:20:52.405274 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c7f47999-twv92"] Dec 03 14:20:52 crc kubenswrapper[5004]: W1203 14:20:52.414288 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d864da6_31ee_490f_b4e8_568f95a96ff0.slice/crio-fb508180977a371b080e8fb9d4abf0b73335302aeecd4da04cbc821127d67d42 WatchSource:0}: Error finding container fb508180977a371b080e8fb9d4abf0b73335302aeecd4da04cbc821127d67d42: Status 404 returned error can't find the container with id fb508180977a371b080e8fb9d4abf0b73335302aeecd4da04cbc821127d67d42 Dec 03 14:20:52 crc kubenswrapper[5004]: I1203 14:20:52.697772 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" event={"ID":"8d864da6-31ee-490f-b4e8-568f95a96ff0","Type":"ContainerStarted","Data":"fb508180977a371b080e8fb9d4abf0b73335302aeecd4da04cbc821127d67d42"} Dec 03 14:20:52 crc kubenswrapper[5004]: I1203 14:20:52.699427 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" event={"ID":"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5","Type":"ContainerStarted","Data":"47961e294df8ff698134ae53374c5aab002cde26c49db43c247a4e6b4a15e385"} Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.134834 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.137311 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.156207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.226350 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.226724 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kgvs\" (UniqueName: \"kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.226773 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " 
pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.327694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kgvs\" (UniqueName: \"kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.327759 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.327810 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.328282 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.328819 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " 
pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.378442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kgvs\" (UniqueName: \"kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs\") pod \"certified-operators-fpq7b\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:55 crc kubenswrapper[5004]: I1203 14:20:55.498357 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.494043 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.744188 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" event={"ID":"8d864da6-31ee-490f-b4e8-568f95a96ff0","Type":"ContainerStarted","Data":"2a1a6de2ca00101eb0b87babb4004765216576c097006b9a3a31373274d6d4aa"} Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.744569 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.746108 5004 generic.go:334] "Generic (PLEG): container finished" podID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerID="aff59197d6df426d684bc19a1802e8c1520377bb9f80f7dafeaf05b465c223f1" exitCode=0 Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.746160 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerDied","Data":"aff59197d6df426d684bc19a1802e8c1520377bb9f80f7dafeaf05b465c223f1"} Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 
14:20:57.746177 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerStarted","Data":"3d7a031297989e229e31848c6af1bab0cdb18f179ea54daae2d8d16f381421ea"} Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.748062 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" event={"ID":"6aa91ebc-2da0-4d5e-9847-d5f2758e72e5","Type":"ContainerStarted","Data":"d98bfd1483bfd5b0925e5d24d3c62e8afe2aee8a0c6061e5b9d816ceeac147b1"} Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.748153 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.847132 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" podStartSLOduration=1.9884813989999999 podStartE2EDuration="6.847116121s" podCreationTimestamp="2025-12-03 14:20:51 +0000 UTC" firstStartedPulling="2025-12-03 14:20:52.417653411 +0000 UTC m=+865.166623647" lastFinishedPulling="2025-12-03 14:20:57.276288123 +0000 UTC m=+870.025258369" observedRunningTime="2025-12-03 14:20:57.809013429 +0000 UTC m=+870.557983665" watchObservedRunningTime="2025-12-03 14:20:57.847116121 +0000 UTC m=+870.596086357" Dec 03 14:20:57 crc kubenswrapper[5004]: I1203 14:20:57.877204 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" podStartSLOduration=1.641788096 podStartE2EDuration="6.877184185s" podCreationTimestamp="2025-12-03 14:20:51 +0000 UTC" firstStartedPulling="2025-12-03 14:20:52.029068858 +0000 UTC m=+864.778039114" lastFinishedPulling="2025-12-03 14:20:57.264464967 +0000 UTC m=+870.013435203" observedRunningTime="2025-12-03 
14:20:57.874351784 +0000 UTC m=+870.623322030" watchObservedRunningTime="2025-12-03 14:20:57.877184185 +0000 UTC m=+870.626154421" Dec 03 14:20:58 crc kubenswrapper[5004]: I1203 14:20:58.754931 5004 generic.go:334] "Generic (PLEG): container finished" podID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerID="ac82bd995ced8afb6d5ca152a5c61d49c55d6ba79d4b6a7265a045ec9e63f7cc" exitCode=0 Dec 03 14:20:58 crc kubenswrapper[5004]: I1203 14:20:58.755040 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerDied","Data":"ac82bd995ced8afb6d5ca152a5c61d49c55d6ba79d4b6a7265a045ec9e63f7cc"} Dec 03 14:20:59 crc kubenswrapper[5004]: I1203 14:20:59.761851 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerStarted","Data":"7c9a3ef2f88f7566cd6ae94d343041c6517d219652102482efe63afa12186dcb"} Dec 03 14:20:59 crc kubenswrapper[5004]: I1203 14:20:59.780589 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fpq7b" podStartSLOduration=3.080890408 podStartE2EDuration="4.780571557s" podCreationTimestamp="2025-12-03 14:20:55 +0000 UTC" firstStartedPulling="2025-12-03 14:20:57.747851442 +0000 UTC m=+870.496821678" lastFinishedPulling="2025-12-03 14:20:59.447532571 +0000 UTC m=+872.196502827" observedRunningTime="2025-12-03 14:20:59.776140701 +0000 UTC m=+872.525110937" watchObservedRunningTime="2025-12-03 14:20:59.780571557 +0000 UTC m=+872.529541793" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.137431 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.139537 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.145586 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.230387 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5z9\" (UniqueName: \"kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.230473 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.230525 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.332166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5z9\" (UniqueName: \"kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.332243 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.332292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.333015 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.333069 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.353197 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5z9\" (UniqueName: \"kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9\") pod \"redhat-marketplace-2vbrw\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.470933 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.718489 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:02 crc kubenswrapper[5004]: W1203 14:21:02.731285 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45b4f31f_bb13_4981_94dd_bec81f1422e5.slice/crio-3975dc35c857ffc8cfdf19ab5969041916876990539e8b907e271a5320ced8de WatchSource:0}: Error finding container 3975dc35c857ffc8cfdf19ab5969041916876990539e8b907e271a5320ced8de: Status 404 returned error can't find the container with id 3975dc35c857ffc8cfdf19ab5969041916876990539e8b907e271a5320ced8de Dec 03 14:21:02 crc kubenswrapper[5004]: I1203 14:21:02.776495 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerStarted","Data":"3975dc35c857ffc8cfdf19ab5969041916876990539e8b907e271a5320ced8de"} Dec 03 14:21:03 crc kubenswrapper[5004]: I1203 14:21:03.784110 5004 generic.go:334] "Generic (PLEG): container finished" podID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerID="537c4f9bca07c70832c48459a23568260bfe8302ac3f59eec86b52f7113936a3" exitCode=0 Dec 03 14:21:03 crc kubenswrapper[5004]: I1203 14:21:03.784157 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerDied","Data":"537c4f9bca07c70832c48459a23568260bfe8302ac3f59eec86b52f7113936a3"} Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.498585 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.498963 5004 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.547523 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.795386 5004 generic.go:334] "Generic (PLEG): container finished" podID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerID="bae954bea07824c9b49555418ce63b7617cfc17b3c18da2c4de173d3492cac9f" exitCode=0 Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.795490 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerDied","Data":"bae954bea07824c9b49555418ce63b7617cfc17b3c18da2c4de173d3492cac9f"} Dec 03 14:21:05 crc kubenswrapper[5004]: I1203 14:21:05.840965 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:06 crc kubenswrapper[5004]: I1203 14:21:06.816007 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerStarted","Data":"37f3ebe42b82473c5c43f174aa275858f69c1a1a7729fd4eb21b966ef23f9b4a"} Dec 03 14:21:06 crc kubenswrapper[5004]: I1203 14:21:06.845134 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2vbrw" podStartSLOduration=2.372041524 podStartE2EDuration="4.845114202s" podCreationTimestamp="2025-12-03 14:21:02 +0000 UTC" firstStartedPulling="2025-12-03 14:21:03.785546511 +0000 UTC m=+876.534516757" lastFinishedPulling="2025-12-03 14:21:06.258619199 +0000 UTC m=+879.007589435" observedRunningTime="2025-12-03 14:21:06.841930071 +0000 UTC m=+879.590900307" watchObservedRunningTime="2025-12-03 14:21:06.845114202 +0000 UTC m=+879.594084438" Dec 03 
14:21:09 crc kubenswrapper[5004]: I1203 14:21:09.328514 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:21:10 crc kubenswrapper[5004]: I1203 14:21:09.329667 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fpq7b" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="registry-server" containerID="cri-o://7c9a3ef2f88f7566cd6ae94d343041c6517d219652102482efe63afa12186dcb" gracePeriod=2 Dec 03 14:21:10 crc kubenswrapper[5004]: I1203 14:21:10.837696 5004 generic.go:334] "Generic (PLEG): container finished" podID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerID="7c9a3ef2f88f7566cd6ae94d343041c6517d219652102482efe63afa12186dcb" exitCode=0 Dec 03 14:21:10 crc kubenswrapper[5004]: I1203 14:21:10.837769 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerDied","Data":"7c9a3ef2f88f7566cd6ae94d343041c6517d219652102482efe63afa12186dcb"} Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.164383 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.240099 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities\") pod \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.240161 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content\") pod \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.240223 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kgvs\" (UniqueName: \"kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs\") pod \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\" (UID: \"84670d80-1b6a-41f2-a31e-a929e38dcfcd\") " Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.241050 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities" (OuterVolumeSpecName: "utilities") pod "84670d80-1b6a-41f2-a31e-a929e38dcfcd" (UID: "84670d80-1b6a-41f2-a31e-a929e38dcfcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.246305 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs" (OuterVolumeSpecName: "kube-api-access-2kgvs") pod "84670d80-1b6a-41f2-a31e-a929e38dcfcd" (UID: "84670d80-1b6a-41f2-a31e-a929e38dcfcd"). InnerVolumeSpecName "kube-api-access-2kgvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.283839 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84670d80-1b6a-41f2-a31e-a929e38dcfcd" (UID: "84670d80-1b6a-41f2-a31e-a929e38dcfcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.341665 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.341722 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kgvs\" (UniqueName: \"kubernetes.io/projected/84670d80-1b6a-41f2-a31e-a929e38dcfcd-kube-api-access-2kgvs\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.341738 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84670d80-1b6a-41f2-a31e-a929e38dcfcd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.847245 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpq7b" event={"ID":"84670d80-1b6a-41f2-a31e-a929e38dcfcd","Type":"ContainerDied","Data":"3d7a031297989e229e31848c6af1bab0cdb18f179ea54daae2d8d16f381421ea"} Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.847302 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fpq7b" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.847763 5004 scope.go:117] "RemoveContainer" containerID="7c9a3ef2f88f7566cd6ae94d343041c6517d219652102482efe63afa12186dcb" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.871749 5004 scope.go:117] "RemoveContainer" containerID="ac82bd995ced8afb6d5ca152a5c61d49c55d6ba79d4b6a7265a045ec9e63f7cc" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.873225 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.877959 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fpq7b"] Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.887034 5004 scope.go:117] "RemoveContainer" containerID="aff59197d6df426d684bc19a1802e8c1520377bb9f80f7dafeaf05b465c223f1" Dec 03 14:21:11 crc kubenswrapper[5004]: I1203 14:21:11.972239 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8c7f47999-twv92" Dec 03 14:21:12 crc kubenswrapper[5004]: I1203 14:21:12.471942 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:12 crc kubenswrapper[5004]: I1203 14:21:12.472402 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:12 crc kubenswrapper[5004]: I1203 14:21:12.550550 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:12 crc kubenswrapper[5004]: I1203 14:21:12.894415 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:13 crc kubenswrapper[5004]: I1203 14:21:13.624995 5004 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" path="/var/lib/kubelet/pods/84670d80-1b6a-41f2-a31e-a929e38dcfcd/volumes" Dec 03 14:21:16 crc kubenswrapper[5004]: I1203 14:21:16.127997 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:16 crc kubenswrapper[5004]: I1203 14:21:16.128224 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2vbrw" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="registry-server" containerID="cri-o://37f3ebe42b82473c5c43f174aa275858f69c1a1a7729fd4eb21b966ef23f9b4a" gracePeriod=2 Dec 03 14:21:16 crc kubenswrapper[5004]: I1203 14:21:16.879329 5004 generic.go:334] "Generic (PLEG): container finished" podID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerID="37f3ebe42b82473c5c43f174aa275858f69c1a1a7729fd4eb21b966ef23f9b4a" exitCode=0 Dec 03 14:21:16 crc kubenswrapper[5004]: I1203 14:21:16.879420 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerDied","Data":"37f3ebe42b82473c5c43f174aa275858f69c1a1a7729fd4eb21b966ef23f9b4a"} Dec 03 14:21:16 crc kubenswrapper[5004]: I1203 14:21:16.988771 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.009360 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities\") pod \"45b4f31f-bb13-4981-94dd-bec81f1422e5\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.009521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content\") pod \"45b4f31f-bb13-4981-94dd-bec81f1422e5\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.009580 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw5z9\" (UniqueName: \"kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9\") pod \"45b4f31f-bb13-4981-94dd-bec81f1422e5\" (UID: \"45b4f31f-bb13-4981-94dd-bec81f1422e5\") " Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.010258 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities" (OuterVolumeSpecName: "utilities") pod "45b4f31f-bb13-4981-94dd-bec81f1422e5" (UID: "45b4f31f-bb13-4981-94dd-bec81f1422e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.015795 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9" (OuterVolumeSpecName: "kube-api-access-gw5z9") pod "45b4f31f-bb13-4981-94dd-bec81f1422e5" (UID: "45b4f31f-bb13-4981-94dd-bec81f1422e5"). InnerVolumeSpecName "kube-api-access-gw5z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.029176 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45b4f31f-bb13-4981-94dd-bec81f1422e5" (UID: "45b4f31f-bb13-4981-94dd-bec81f1422e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.111060 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw5z9\" (UniqueName: \"kubernetes.io/projected/45b4f31f-bb13-4981-94dd-bec81f1422e5-kube-api-access-gw5z9\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.111103 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.111115 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b4f31f-bb13-4981-94dd-bec81f1422e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.888175 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vbrw" event={"ID":"45b4f31f-bb13-4981-94dd-bec81f1422e5","Type":"ContainerDied","Data":"3975dc35c857ffc8cfdf19ab5969041916876990539e8b907e271a5320ced8de"} Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.888238 5004 scope.go:117] "RemoveContainer" containerID="37f3ebe42b82473c5c43f174aa275858f69c1a1a7729fd4eb21b966ef23f9b4a" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.888244 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vbrw" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.909303 5004 scope.go:117] "RemoveContainer" containerID="bae954bea07824c9b49555418ce63b7617cfc17b3c18da2c4de173d3492cac9f" Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.914710 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.923075 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vbrw"] Dec 03 14:21:17 crc kubenswrapper[5004]: I1203 14:21:17.930102 5004 scope.go:117] "RemoveContainer" containerID="537c4f9bca07c70832c48459a23568260bfe8302ac3f59eec86b52f7113936a3" Dec 03 14:21:19 crc kubenswrapper[5004]: I1203 14:21:19.622288 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" path="/var/lib/kubelet/pods/45b4f31f-bb13-4981-94dd-bec81f1422e5/volumes" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.344254 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.344875 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="registry-server" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.344895 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="registry-server" Dec 03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.344912 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="extract-content" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.344920 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="extract-content" Dec 
03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.344934 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="extract-content" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.344944 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="extract-content" Dec 03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.344955 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="extract-utilities" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.344963 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="extract-utilities" Dec 03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.344994 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="extract-utilities" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.345002 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="extract-utilities" Dec 03 14:21:23 crc kubenswrapper[5004]: E1203 14:21:23.345011 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="registry-server" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.345018 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="registry-server" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.345151 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b4f31f-bb13-4981-94dd-bec81f1422e5" containerName="registry-server" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.345166 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="84670d80-1b6a-41f2-a31e-a929e38dcfcd" containerName="registry-server" 
Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.346149 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.358389 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.490694 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.490743 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsnjb\" (UniqueName: \"kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.490762 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.592185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " 
pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.592231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsnjb\" (UniqueName: \"kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.592248 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.592613 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.592668 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.623687 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsnjb\" (UniqueName: \"kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb\") pod \"community-operators-zgsgx\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " 
pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.670900 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:23 crc kubenswrapper[5004]: I1203 14:21:23.947936 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:24 crc kubenswrapper[5004]: I1203 14:21:24.952868 5004 generic.go:334] "Generic (PLEG): container finished" podID="bdcecf36-3a19-4415-b396-87bc31f12378" containerID="e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb" exitCode=0 Dec 03 14:21:24 crc kubenswrapper[5004]: I1203 14:21:24.952932 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerDied","Data":"e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb"} Dec 03 14:21:24 crc kubenswrapper[5004]: I1203 14:21:24.953170 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerStarted","Data":"f094940eff3b5750cb487e7f5960a4ff117c917919dbe9872ff8b8944d1d59b7"} Dec 03 14:21:26 crc kubenswrapper[5004]: I1203 14:21:26.965077 5004 generic.go:334] "Generic (PLEG): container finished" podID="bdcecf36-3a19-4415-b396-87bc31f12378" containerID="4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88" exitCode=0 Dec 03 14:21:26 crc kubenswrapper[5004]: I1203 14:21:26.965145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerDied","Data":"4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88"} Dec 03 14:21:27 crc kubenswrapper[5004]: I1203 14:21:27.972882 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerStarted","Data":"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179"} Dec 03 14:21:27 crc kubenswrapper[5004]: I1203 14:21:27.996416 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zgsgx" podStartSLOduration=2.544615688 podStartE2EDuration="4.996394208s" podCreationTimestamp="2025-12-03 14:21:23 +0000 UTC" firstStartedPulling="2025-12-03 14:21:24.956006037 +0000 UTC m=+897.704976303" lastFinishedPulling="2025-12-03 14:21:27.407784577 +0000 UTC m=+900.156754823" observedRunningTime="2025-12-03 14:21:27.994996958 +0000 UTC m=+900.743967194" watchObservedRunningTime="2025-12-03 14:21:27.996394208 +0000 UTC m=+900.745364444" Dec 03 14:21:31 crc kubenswrapper[5004]: I1203 14:21:31.726809 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-b7864bd46-w9w6m" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.525650 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9r8qk"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.529091 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.530592 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q6tq2" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.530896 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.532344 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.533110 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.533480 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.540732 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.555346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.629521 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6kv8g"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.632456 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.635231 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9m47v" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.635488 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.635823 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.639715 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.647275 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-vfwkq"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.662245 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.664533 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.678914 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vfwkq"] Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.712833 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttps9\" (UniqueName: \"kubernetes.io/projected/eba563db-6e27-423d-8739-ea22c19318ac-kube-api-access-ttps9\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.712903 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eba563db-6e27-423d-8739-ea22c19318ac-frr-startup\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.712945 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-sockets\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.712975 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p892\" (UniqueName: \"kubernetes.io/projected/b1856a9d-f833-48a2-941b-8c9fd3f06416-kube-api-access-6p892\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.713016 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eba563db-6e27-423d-8739-ea22c19318ac-metrics-certs\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.713053 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-metrics\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.713076 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-reloader\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.713101 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1856a9d-f833-48a2-941b-8c9fd3f06416-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.713122 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-conf\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: 
I1203 14:21:32.814491 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttps9\" (UniqueName: \"kubernetes.io/projected/eba563db-6e27-423d-8739-ea22c19318ac-kube-api-access-ttps9\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814536 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eba563db-6e27-423d-8739-ea22c19318ac-frr-startup\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814560 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-metrics-certs\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814602 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-metrics-certs\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-sockets\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814666 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ktl7g\" (UniqueName: \"kubernetes.io/projected/0af74282-be81-45a2-966a-4dcb279d7c6a-kube-api-access-ktl7g\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814687 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p892\" (UniqueName: \"kubernetes.io/projected/b1856a9d-f833-48a2-941b-8c9fd3f06416-kube-api-access-6p892\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814701 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/27227413-e203-4218-942d-35c1493b7015-metallb-excludel2\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814753 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eba563db-6e27-423d-8739-ea22c19318ac-metrics-certs\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814776 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-cert\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814811 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5gxgx\" (UniqueName: \"kubernetes.io/projected/27227413-e203-4218-942d-35c1493b7015-kube-api-access-5gxgx\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814824 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814891 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-metrics\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814920 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-reloader\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1856a9d-f833-48a2-941b-8c9fd3f06416-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.814985 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-conf\") pod \"frr-k8s-9r8qk\" (UID: 
\"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.815475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eba563db-6e27-423d-8739-ea22c19318ac-frr-startup\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.816181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-sockets\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.816345 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-metrics\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.816369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-frr-conf\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.816633 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eba563db-6e27-423d-8739-ea22c19318ac-reloader\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.826378 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/eba563db-6e27-423d-8739-ea22c19318ac-metrics-certs\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.830393 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1856a9d-f833-48a2-941b-8c9fd3f06416-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.833687 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttps9\" (UniqueName: \"kubernetes.io/projected/eba563db-6e27-423d-8739-ea22c19318ac-kube-api-access-ttps9\") pod \"frr-k8s-9r8qk\" (UID: \"eba563db-6e27-423d-8739-ea22c19318ac\") " pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.835351 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p892\" (UniqueName: \"kubernetes.io/projected/b1856a9d-f833-48a2-941b-8c9fd3f06416-kube-api-access-6p892\") pod \"frr-k8s-webhook-server-7fcb986d4-jksx8\" (UID: \"b1856a9d-f833-48a2-941b-8c9fd3f06416\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.854738 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.862322 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917229 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-metrics-certs\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917285 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-metrics-certs\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktl7g\" (UniqueName: \"kubernetes.io/projected/0af74282-be81-45a2-966a-4dcb279d7c6a-kube-api-access-ktl7g\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917356 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/27227413-e203-4218-942d-35c1493b7015-metallb-excludel2\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917393 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-cert\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc 
kubenswrapper[5004]: I1203 14:21:32.917422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxgx\" (UniqueName: \"kubernetes.io/projected/27227413-e203-4218-942d-35c1493b7015-kube-api-access-5gxgx\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.917443 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: E1203 14:21:32.917567 5004 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 14:21:32 crc kubenswrapper[5004]: E1203 14:21:32.917634 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist podName:27227413-e203-4218-942d-35c1493b7015 nodeName:}" failed. No retries permitted until 2025-12-03 14:21:33.417613785 +0000 UTC m=+906.166584021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist") pod "speaker-6kv8g" (UID: "27227413-e203-4218-942d-35c1493b7015") : secret "metallb-memberlist" not found Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.918359 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/27227413-e203-4218-942d-35c1493b7015-metallb-excludel2\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.920882 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.922519 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-metrics-certs\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.923574 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-metrics-certs\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.931060 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0af74282-be81-45a2-966a-4dcb279d7c6a-cert\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.937479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ktl7g\" (UniqueName: \"kubernetes.io/projected/0af74282-be81-45a2-966a-4dcb279d7c6a-kube-api-access-ktl7g\") pod \"controller-f8648f98b-vfwkq\" (UID: \"0af74282-be81-45a2-966a-4dcb279d7c6a\") " pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.950770 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxgx\" (UniqueName: \"kubernetes.io/projected/27227413-e203-4218-942d-35c1493b7015-kube-api-access-5gxgx\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:32 crc kubenswrapper[5004]: I1203 14:21:32.980350 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.327004 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8"] Dec 03 14:21:33 crc kubenswrapper[5004]: W1203 14:21:33.336849 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1856a9d_f833_48a2_941b_8c9fd3f06416.slice/crio-bb85336f4f17ec5a5abfb87c2b48aaf2a2f646bef2a4a6b4e73dcfd721778ff7 WatchSource:0}: Error finding container bb85336f4f17ec5a5abfb87c2b48aaf2a2f646bef2a4a6b4e73dcfd721778ff7: Status 404 returned error can't find the container with id bb85336f4f17ec5a5abfb87c2b48aaf2a2f646bef2a4a6b4e73dcfd721778ff7 Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.401935 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vfwkq"] Dec 03 14:21:33 crc kubenswrapper[5004]: W1203 14:21:33.404797 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af74282_be81_45a2_966a_4dcb279d7c6a.slice/crio-50e9d8b8e84e50abaf8b5d61426398dcd2c695d0fcb1fb223ad0013b8c1a4f07 WatchSource:0}: Error finding container 50e9d8b8e84e50abaf8b5d61426398dcd2c695d0fcb1fb223ad0013b8c1a4f07: Status 404 returned error can't find the container with id 50e9d8b8e84e50abaf8b5d61426398dcd2c695d0fcb1fb223ad0013b8c1a4f07 Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.423271 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:33 crc kubenswrapper[5004]: E1203 14:21:33.423446 5004 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 14:21:33 crc kubenswrapper[5004]: E1203 14:21:33.423527 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist podName:27227413-e203-4218-942d-35c1493b7015 nodeName:}" failed. No retries permitted until 2025-12-03 14:21:34.423504786 +0000 UTC m=+907.172475022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist") pod "speaker-6kv8g" (UID: "27227413-e203-4218-942d-35c1493b7015") : secret "metallb-memberlist" not found Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.671326 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.672691 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:33 crc kubenswrapper[5004]: I1203 14:21:33.721893 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.020798 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" event={"ID":"b1856a9d-f833-48a2-941b-8c9fd3f06416","Type":"ContainerStarted","Data":"bb85336f4f17ec5a5abfb87c2b48aaf2a2f646bef2a4a6b4e73dcfd721778ff7"} Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.022561 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vfwkq" event={"ID":"0af74282-be81-45a2-966a-4dcb279d7c6a","Type":"ContainerStarted","Data":"764239d77d4d1c18248d89f449b7f42fb3378f25a643333be3ea5f4ff6299870"} Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.022616 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vfwkq" event={"ID":"0af74282-be81-45a2-966a-4dcb279d7c6a","Type":"ContainerStarted","Data":"07dc61ede656e923b135af522e2719eadc0ab481e55d1e0e3f5a279a94a4963b"} Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.022630 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vfwkq" 
event={"ID":"0af74282-be81-45a2-966a-4dcb279d7c6a","Type":"ContainerStarted","Data":"50e9d8b8e84e50abaf8b5d61426398dcd2c695d0fcb1fb223ad0013b8c1a4f07"} Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.022774 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.023659 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"c9aaea701402e34ec2483f693895ae052848e8055f0f9523081aa9948dedc74e"} Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.045941 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-vfwkq" podStartSLOduration=2.045921439 podStartE2EDuration="2.045921439s" podCreationTimestamp="2025-12-03 14:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:21:34.043391196 +0000 UTC m=+906.792361432" watchObservedRunningTime="2025-12-03 14:21:34.045921439 +0000 UTC m=+906.794891675" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.071183 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.109842 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.435308 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.445606 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/27227413-e203-4218-942d-35c1493b7015-memberlist\") pod \"speaker-6kv8g\" (UID: \"27227413-e203-4218-942d-35c1493b7015\") " pod="metallb-system/speaker-6kv8g" Dec 03 14:21:34 crc kubenswrapper[5004]: I1203 14:21:34.453196 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6kv8g" Dec 03 14:21:34 crc kubenswrapper[5004]: W1203 14:21:34.479015 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27227413_e203_4218_942d_35c1493b7015.slice/crio-62b5fd722e777a445281dfe65dbf72c0634a5648d9dffc1c94959735b93d158b WatchSource:0}: Error finding container 62b5fd722e777a445281dfe65dbf72c0634a5648d9dffc1c94959735b93d158b: Status 404 returned error can't find the container with id 62b5fd722e777a445281dfe65dbf72c0634a5648d9dffc1c94959735b93d158b Dec 03 14:21:35 crc kubenswrapper[5004]: I1203 14:21:35.050199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6kv8g" event={"ID":"27227413-e203-4218-942d-35c1493b7015","Type":"ContainerStarted","Data":"83f6424deeb45c721d49c508a9501b0ae731d954e2016a58fa3cf13cbaeb0324"} Dec 03 14:21:35 crc kubenswrapper[5004]: I1203 14:21:35.050468 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6kv8g" event={"ID":"27227413-e203-4218-942d-35c1493b7015","Type":"ContainerStarted","Data":"62b5fd722e777a445281dfe65dbf72c0634a5648d9dffc1c94959735b93d158b"} Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.060996 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zgsgx" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="registry-server" containerID="cri-o://834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179" gracePeriod=2 Dec 03 14:21:36 crc 
kubenswrapper[5004]: I1203 14:21:36.061595 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6kv8g" event={"ID":"27227413-e203-4218-942d-35c1493b7015","Type":"ContainerStarted","Data":"18a062e11ff8f10362dcec77dd1c8663fe12aa68db1b94760f0cb4efd9ea6696"} Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.061999 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6kv8g" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.084332 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6kv8g" podStartSLOduration=4.084308093 podStartE2EDuration="4.084308093s" podCreationTimestamp="2025-12-03 14:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:21:36.080628397 +0000 UTC m=+908.829598633" watchObservedRunningTime="2025-12-03 14:21:36.084308093 +0000 UTC m=+908.833278329" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.533013 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.575073 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities\") pod \"bdcecf36-3a19-4415-b396-87bc31f12378\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.575139 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsnjb\" (UniqueName: \"kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb\") pod \"bdcecf36-3a19-4415-b396-87bc31f12378\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.576049 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities" (OuterVolumeSpecName: "utilities") pod "bdcecf36-3a19-4415-b396-87bc31f12378" (UID: "bdcecf36-3a19-4415-b396-87bc31f12378"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.576087 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content\") pod \"bdcecf36-3a19-4415-b396-87bc31f12378\" (UID: \"bdcecf36-3a19-4415-b396-87bc31f12378\") " Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.576648 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.580417 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb" (OuterVolumeSpecName: "kube-api-access-zsnjb") pod "bdcecf36-3a19-4415-b396-87bc31f12378" (UID: "bdcecf36-3a19-4415-b396-87bc31f12378"). InnerVolumeSpecName "kube-api-access-zsnjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.671231 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdcecf36-3a19-4415-b396-87bc31f12378" (UID: "bdcecf36-3a19-4415-b396-87bc31f12378"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.677430 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcecf36-3a19-4415-b396-87bc31f12378-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:36 crc kubenswrapper[5004]: I1203 14:21:36.677464 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsnjb\" (UniqueName: \"kubernetes.io/projected/bdcecf36-3a19-4415-b396-87bc31f12378-kube-api-access-zsnjb\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.073841 5004 generic.go:334] "Generic (PLEG): container finished" podID="bdcecf36-3a19-4415-b396-87bc31f12378" containerID="834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179" exitCode=0 Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.073917 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zgsgx" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.073936 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerDied","Data":"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179"} Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.074026 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zgsgx" event={"ID":"bdcecf36-3a19-4415-b396-87bc31f12378","Type":"ContainerDied","Data":"f094940eff3b5750cb487e7f5960a4ff117c917919dbe9872ff8b8944d1d59b7"} Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.074050 5004 scope.go:117] "RemoveContainer" containerID="834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.105592 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.110669 5004 scope.go:117] "RemoveContainer" containerID="4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.112213 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zgsgx"] Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.127380 5004 scope.go:117] "RemoveContainer" containerID="e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.147409 5004 scope.go:117] "RemoveContainer" containerID="834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179" Dec 03 14:21:37 crc kubenswrapper[5004]: E1203 14:21:37.148405 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179\": container with ID starting with 834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179 not found: ID does not exist" containerID="834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.148455 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179"} err="failed to get container status \"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179\": rpc error: code = NotFound desc = could not find container \"834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179\": container with ID starting with 834b873a9aba558ff93b20a8e145f0a87fe84b7ef81feed9320149bad0814179 not found: ID does not exist" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.148539 5004 scope.go:117] "RemoveContainer" 
containerID="4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88" Dec 03 14:21:37 crc kubenswrapper[5004]: E1203 14:21:37.149519 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88\": container with ID starting with 4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88 not found: ID does not exist" containerID="4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.149549 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88"} err="failed to get container status \"4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88\": rpc error: code = NotFound desc = could not find container \"4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88\": container with ID starting with 4b76be7439228ead2f3b2c1d1805eeb0050957be7f70a634144384d413cb7c88 not found: ID does not exist" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.149567 5004 scope.go:117] "RemoveContainer" containerID="e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb" Dec 03 14:21:37 crc kubenswrapper[5004]: E1203 14:21:37.149952 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb\": container with ID starting with e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb not found: ID does not exist" containerID="e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.149977 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb"} err="failed to get container status \"e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb\": rpc error: code = NotFound desc = could not find container \"e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb\": container with ID starting with e975538957b3b7cad5db540b655f115466e6a5349f7e403cb1c2a35f123a48bb not found: ID does not exist" Dec 03 14:21:37 crc kubenswrapper[5004]: I1203 14:21:37.621884 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" path="/var/lib/kubelet/pods/bdcecf36-3a19-4415-b396-87bc31f12378/volumes" Dec 03 14:21:41 crc kubenswrapper[5004]: I1203 14:21:41.098881 5004 generic.go:334] "Generic (PLEG): container finished" podID="eba563db-6e27-423d-8739-ea22c19318ac" containerID="1e1fc11a5725b78a7a13690038259d0b9aab954306a3d7d281ef39fbd3f675eb" exitCode=0 Dec 03 14:21:41 crc kubenswrapper[5004]: I1203 14:21:41.098922 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerDied","Data":"1e1fc11a5725b78a7a13690038259d0b9aab954306a3d7d281ef39fbd3f675eb"} Dec 03 14:21:41 crc kubenswrapper[5004]: I1203 14:21:41.101081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" event={"ID":"b1856a9d-f833-48a2-941b-8c9fd3f06416","Type":"ContainerStarted","Data":"1052d249ff66f224d13dc787633b7655a2d70d5b948ad896dc4e2237e51fbf5d"} Dec 03 14:21:41 crc kubenswrapper[5004]: I1203 14:21:41.101217 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:41 crc kubenswrapper[5004]: I1203 14:21:41.134834 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" 
podStartSLOduration=1.6744027369999999 podStartE2EDuration="9.134812578s" podCreationTimestamp="2025-12-03 14:21:32 +0000 UTC" firstStartedPulling="2025-12-03 14:21:33.339342926 +0000 UTC m=+906.088313162" lastFinishedPulling="2025-12-03 14:21:40.799752767 +0000 UTC m=+913.548723003" observedRunningTime="2025-12-03 14:21:41.134668973 +0000 UTC m=+913.883639219" watchObservedRunningTime="2025-12-03 14:21:41.134812578 +0000 UTC m=+913.883782814" Dec 03 14:21:42 crc kubenswrapper[5004]: I1203 14:21:42.107931 5004 generic.go:334] "Generic (PLEG): container finished" podID="eba563db-6e27-423d-8739-ea22c19318ac" containerID="4e36ead149f8e4ac80e29f7aafaf413b68ac07b33d769c965778bde89212e5b5" exitCode=0 Dec 03 14:21:42 crc kubenswrapper[5004]: I1203 14:21:42.108011 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerDied","Data":"4e36ead149f8e4ac80e29f7aafaf413b68ac07b33d769c965778bde89212e5b5"} Dec 03 14:21:43 crc kubenswrapper[5004]: I1203 14:21:43.115080 5004 generic.go:334] "Generic (PLEG): container finished" podID="eba563db-6e27-423d-8739-ea22c19318ac" containerID="bd3aa19fdfd67d0ca376feb3d593984a7b0ca0b5356ad4754fcc0c2cf07dc290" exitCode=0 Dec 03 14:21:43 crc kubenswrapper[5004]: I1203 14:21:43.115203 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerDied","Data":"bd3aa19fdfd67d0ca376feb3d593984a7b0ca0b5356ad4754fcc0c2cf07dc290"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.127756 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"a546694ecbafe8fb2e271999c490f606465f6a1f3d27e008bdbe0a041ccfde50"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128107 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"a2f7a3aa41b6097a154d54fd2c3b44b6110caecbc6849b0833af3d26f4d543fd"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128122 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"c460ac57a08ee4a9d8d7c7da0d0fd09b207864b253cc7204ad75d48bf86e83ac"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128134 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"6f96e565fbbf32138cb4291392fe46dd7bb171c1c0f54f70baea2b1c025485f7"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"bb13c426578a7b91adaff2e8ed0763d5c7d74f340eef1b81aa46c748c57f16be"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128153 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9r8qk" event={"ID":"eba563db-6e27-423d-8739-ea22c19318ac","Type":"ContainerStarted","Data":"619077b2a353c7c785fee18afa017f217865cf001fcb28d0058e874221bdd04f"} Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.128208 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.152492 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9r8qk" podStartSLOduration=4.461976921 podStartE2EDuration="12.152478285s" podCreationTimestamp="2025-12-03 14:21:32 +0000 UTC" firstStartedPulling="2025-12-03 14:21:33.076540856 +0000 UTC m=+905.825511092" lastFinishedPulling="2025-12-03 14:21:40.76704222 
+0000 UTC m=+913.516012456" observedRunningTime="2025-12-03 14:21:44.151644461 +0000 UTC m=+916.900614717" watchObservedRunningTime="2025-12-03 14:21:44.152478285 +0000 UTC m=+916.901448521" Dec 03 14:21:44 crc kubenswrapper[5004]: I1203 14:21:44.458053 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6kv8g" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.243928 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:47 crc kubenswrapper[5004]: E1203 14:21:47.244625 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="extract-utilities" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.244637 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="extract-utilities" Dec 03 14:21:47 crc kubenswrapper[5004]: E1203 14:21:47.244655 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="extract-content" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.244661 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="extract-content" Dec 03 14:21:47 crc kubenswrapper[5004]: E1203 14:21:47.244678 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="registry-server" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.244684 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="registry-server" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.245058 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdcecf36-3a19-4415-b396-87bc31f12378" containerName="registry-server" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.250760 
5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.250851 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.253945 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xq4n6" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.256967 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.257456 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.423901 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhcg\" (UniqueName: \"kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg\") pod \"openstack-operator-index-w4bxt\" (UID: \"abbf21e6-2f41-4806-8512-9f01fba0afab\") " pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.525714 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhcg\" (UniqueName: \"kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg\") pod \"openstack-operator-index-w4bxt\" (UID: \"abbf21e6-2f41-4806-8512-9f01fba0afab\") " pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.545666 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhcg\" (UniqueName: \"kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg\") pod 
\"openstack-operator-index-w4bxt\" (UID: \"abbf21e6-2f41-4806-8512-9f01fba0afab\") " pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.615545 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.855250 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:47 crc kubenswrapper[5004]: I1203 14:21:47.894762 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:21:48 crc kubenswrapper[5004]: I1203 14:21:48.004279 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:48 crc kubenswrapper[5004]: W1203 14:21:48.011401 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbf21e6_2f41_4806_8512_9f01fba0afab.slice/crio-b7248403eab482fc32f3b57a440d3839931d100250359908521e696d3350f8c3 WatchSource:0}: Error finding container b7248403eab482fc32f3b57a440d3839931d100250359908521e696d3350f8c3: Status 404 returned error can't find the container with id b7248403eab482fc32f3b57a440d3839931d100250359908521e696d3350f8c3 Dec 03 14:21:48 crc kubenswrapper[5004]: I1203 14:21:48.153956 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4bxt" event={"ID":"abbf21e6-2f41-4806-8512-9f01fba0afab","Type":"ContainerStarted","Data":"b7248403eab482fc32f3b57a440d3839931d100250359908521e696d3350f8c3"} Dec 03 14:21:50 crc kubenswrapper[5004]: I1203 14:21:50.406395 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.013240 5004 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/openstack-operator-index-x4d5l"] Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.014247 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.024810 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x4d5l"] Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.183362 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jlc\" (UniqueName: \"kubernetes.io/projected/dc67c749-644a-416d-8f75-ebd340795204-kube-api-access-55jlc\") pod \"openstack-operator-index-x4d5l\" (UID: \"dc67c749-644a-416d-8f75-ebd340795204\") " pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.285109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55jlc\" (UniqueName: \"kubernetes.io/projected/dc67c749-644a-416d-8f75-ebd340795204-kube-api-access-55jlc\") pod \"openstack-operator-index-x4d5l\" (UID: \"dc67c749-644a-416d-8f75-ebd340795204\") " pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.302973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jlc\" (UniqueName: \"kubernetes.io/projected/dc67c749-644a-416d-8f75-ebd340795204-kube-api-access-55jlc\") pod \"openstack-operator-index-x4d5l\" (UID: \"dc67c749-644a-416d-8f75-ebd340795204\") " pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.332437 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:21:51 crc kubenswrapper[5004]: I1203 14:21:51.703670 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x4d5l"] Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.187898 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4bxt" event={"ID":"abbf21e6-2f41-4806-8512-9f01fba0afab","Type":"ContainerStarted","Data":"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de"} Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.188050 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-w4bxt" podUID="abbf21e6-2f41-4806-8512-9f01fba0afab" containerName="registry-server" containerID="cri-o://4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de" gracePeriod=2 Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.189726 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x4d5l" event={"ID":"dc67c749-644a-416d-8f75-ebd340795204","Type":"ContainerStarted","Data":"185ffee718d4169679795a01ab03c93b48771ec1fc038eacdfc77c14d6b67267"} Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.189770 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x4d5l" event={"ID":"dc67c749-644a-416d-8f75-ebd340795204","Type":"ContainerStarted","Data":"10f89d911e5bef010393766f71359ac5127aa505135c3df5e1116e0715a1a648"} Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.203190 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w4bxt" podStartSLOduration=1.399581037 podStartE2EDuration="5.203170421s" podCreationTimestamp="2025-12-03 14:21:47 +0000 UTC" firstStartedPulling="2025-12-03 14:21:48.013710401 +0000 UTC 
m=+920.762680637" lastFinishedPulling="2025-12-03 14:21:51.817299785 +0000 UTC m=+924.566270021" observedRunningTime="2025-12-03 14:21:52.20206718 +0000 UTC m=+924.951037416" watchObservedRunningTime="2025-12-03 14:21:52.203170421 +0000 UTC m=+924.952140657" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.222479 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-x4d5l" podStartSLOduration=2.11648483 podStartE2EDuration="2.222463084s" podCreationTimestamp="2025-12-03 14:21:50 +0000 UTC" firstStartedPulling="2025-12-03 14:21:51.713254246 +0000 UTC m=+924.462224482" lastFinishedPulling="2025-12-03 14:21:51.8192325 +0000 UTC m=+924.568202736" observedRunningTime="2025-12-03 14:21:52.220744264 +0000 UTC m=+924.969714500" watchObservedRunningTime="2025-12-03 14:21:52.222463084 +0000 UTC m=+924.971433310" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.536594 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.704952 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lhcg\" (UniqueName: \"kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg\") pod \"abbf21e6-2f41-4806-8512-9f01fba0afab\" (UID: \"abbf21e6-2f41-4806-8512-9f01fba0afab\") " Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.712492 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg" (OuterVolumeSpecName: "kube-api-access-5lhcg") pod "abbf21e6-2f41-4806-8512-9f01fba0afab" (UID: "abbf21e6-2f41-4806-8512-9f01fba0afab"). InnerVolumeSpecName "kube-api-access-5lhcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.806440 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lhcg\" (UniqueName: \"kubernetes.io/projected/abbf21e6-2f41-4806-8512-9f01fba0afab-kube-api-access-5lhcg\") on node \"crc\" DevicePath \"\"" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.867961 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jksx8" Dec 03 14:21:52 crc kubenswrapper[5004]: I1203 14:21:52.984807 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-vfwkq" Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.197156 5004 generic.go:334] "Generic (PLEG): container finished" podID="abbf21e6-2f41-4806-8512-9f01fba0afab" containerID="4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de" exitCode=0 Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.197206 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4bxt" event={"ID":"abbf21e6-2f41-4806-8512-9f01fba0afab","Type":"ContainerDied","Data":"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de"} Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.197254 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4bxt" event={"ID":"abbf21e6-2f41-4806-8512-9f01fba0afab","Type":"ContainerDied","Data":"b7248403eab482fc32f3b57a440d3839931d100250359908521e696d3350f8c3"} Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.197273 5004 scope.go:117] "RemoveContainer" containerID="4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de" Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.197225 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-w4bxt" Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.217340 5004 scope.go:117] "RemoveContainer" containerID="4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de" Dec 03 14:21:53 crc kubenswrapper[5004]: E1203 14:21:53.217765 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de\": container with ID starting with 4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de not found: ID does not exist" containerID="4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de" Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.217801 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de"} err="failed to get container status \"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de\": rpc error: code = NotFound desc = could not find container \"4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de\": container with ID starting with 4e04f3acd11a62770a214723c90713f7deec94b8beab4e7f83eb618f33f094de not found: ID does not exist" Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.223481 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.226727 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-w4bxt"] Dec 03 14:21:53 crc kubenswrapper[5004]: I1203 14:21:53.619883 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abbf21e6-2f41-4806-8512-9f01fba0afab" path="/var/lib/kubelet/pods/abbf21e6-2f41-4806-8512-9f01fba0afab/volumes" Dec 03 14:22:01 crc kubenswrapper[5004]: I1203 14:22:01.332973 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:22:01 crc kubenswrapper[5004]: I1203 14:22:01.333534 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:22:01 crc kubenswrapper[5004]: I1203 14:22:01.363104 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:22:02 crc kubenswrapper[5004]: I1203 14:22:02.280532 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-x4d5l" Dec 03 14:22:02 crc kubenswrapper[5004]: I1203 14:22:02.861286 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9r8qk" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.054633 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8"] Dec 03 14:22:08 crc kubenswrapper[5004]: E1203 14:22:08.055570 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbf21e6-2f41-4806-8512-9f01fba0afab" containerName="registry-server" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.055588 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbf21e6-2f41-4806-8512-9f01fba0afab" containerName="registry-server" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.055716 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="abbf21e6-2f41-4806-8512-9f01fba0afab" containerName="registry-server" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.056623 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.061583 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qq45n" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.075697 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8"] Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.201455 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.201528 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.201550 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2km7z\" (UniqueName: \"kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 
14:22:08.303158 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.303524 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2km7z\" (UniqueName: \"kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.303735 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.304108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.304606 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.322186 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2km7z\" (UniqueName: \"kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z\") pod \"7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.382047 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:08 crc kubenswrapper[5004]: I1203 14:22:08.611191 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8"] Dec 03 14:22:08 crc kubenswrapper[5004]: W1203 14:22:08.615306 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c96b2b_489e_47dd_9a49_30ee58d31916.slice/crio-afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18 WatchSource:0}: Error finding container afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18: Status 404 returned error can't find the container with id afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18 Dec 03 14:22:09 crc kubenswrapper[5004]: I1203 14:22:09.300098 5004 generic.go:334] "Generic (PLEG): container finished" podID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerID="f14527b2f329fc1676daf10114617b7a06673b2a7dd25cb9ed608068c0b57a92" exitCode=0 Dec 03 
14:22:09 crc kubenswrapper[5004]: I1203 14:22:09.300217 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" event={"ID":"11c96b2b-489e-47dd-9a49-30ee58d31916","Type":"ContainerDied","Data":"f14527b2f329fc1676daf10114617b7a06673b2a7dd25cb9ed608068c0b57a92"} Dec 03 14:22:09 crc kubenswrapper[5004]: I1203 14:22:09.300456 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" event={"ID":"11c96b2b-489e-47dd-9a49-30ee58d31916","Type":"ContainerStarted","Data":"afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18"} Dec 03 14:22:10 crc kubenswrapper[5004]: I1203 14:22:10.310803 5004 generic.go:334] "Generic (PLEG): container finished" podID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerID="e396847cc0a158beffac7e8e872f8bc54c25c510912968c2eedabcef8a500e1d" exitCode=0 Dec 03 14:22:10 crc kubenswrapper[5004]: I1203 14:22:10.310887 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" event={"ID":"11c96b2b-489e-47dd-9a49-30ee58d31916","Type":"ContainerDied","Data":"e396847cc0a158beffac7e8e872f8bc54c25c510912968c2eedabcef8a500e1d"} Dec 03 14:22:11 crc kubenswrapper[5004]: I1203 14:22:11.320754 5004 generic.go:334] "Generic (PLEG): container finished" podID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerID="ae78155ee5067d981d7a578b6fc9fe8bd048fc13723dadf363cba157360c59b2" exitCode=0 Dec 03 14:22:11 crc kubenswrapper[5004]: I1203 14:22:11.320800 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" event={"ID":"11c96b2b-489e-47dd-9a49-30ee58d31916","Type":"ContainerDied","Data":"ae78155ee5067d981d7a578b6fc9fe8bd048fc13723dadf363cba157360c59b2"} Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.591421 
5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.675305 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2km7z\" (UniqueName: \"kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z\") pod \"11c96b2b-489e-47dd-9a49-30ee58d31916\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.675373 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util\") pod \"11c96b2b-489e-47dd-9a49-30ee58d31916\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.675410 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle\") pod \"11c96b2b-489e-47dd-9a49-30ee58d31916\" (UID: \"11c96b2b-489e-47dd-9a49-30ee58d31916\") " Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.676363 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle" (OuterVolumeSpecName: "bundle") pod "11c96b2b-489e-47dd-9a49-30ee58d31916" (UID: "11c96b2b-489e-47dd-9a49-30ee58d31916"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.684637 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z" (OuterVolumeSpecName: "kube-api-access-2km7z") pod "11c96b2b-489e-47dd-9a49-30ee58d31916" (UID: "11c96b2b-489e-47dd-9a49-30ee58d31916"). 
InnerVolumeSpecName "kube-api-access-2km7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.702429 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util" (OuterVolumeSpecName: "util") pod "11c96b2b-489e-47dd-9a49-30ee58d31916" (UID: "11c96b2b-489e-47dd-9a49-30ee58d31916"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.777571 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.777645 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2km7z\" (UniqueName: \"kubernetes.io/projected/11c96b2b-489e-47dd-9a49-30ee58d31916-kube-api-access-2km7z\") on node \"crc\" DevicePath \"\"" Dec 03 14:22:12 crc kubenswrapper[5004]: I1203 14:22:12.777660 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c96b2b-489e-47dd-9a49-30ee58d31916-util\") on node \"crc\" DevicePath \"\"" Dec 03 14:22:13 crc kubenswrapper[5004]: I1203 14:22:13.340876 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" event={"ID":"11c96b2b-489e-47dd-9a49-30ee58d31916","Type":"ContainerDied","Data":"afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18"} Dec 03 14:22:13 crc kubenswrapper[5004]: I1203 14:22:13.340913 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8" Dec 03 14:22:13 crc kubenswrapper[5004]: I1203 14:22:13.340920 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe725c5b9a436bfca54528fe6cc81904f8cd6da793fe944d409ad12c204ac18" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.092561 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg"] Dec 03 14:22:20 crc kubenswrapper[5004]: E1203 14:22:20.093327 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="util" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.093340 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="util" Dec 03 14:22:20 crc kubenswrapper[5004]: E1203 14:22:20.093366 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="extract" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.093372 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="extract" Dec 03 14:22:20 crc kubenswrapper[5004]: E1203 14:22:20.093390 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="pull" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.093396 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="pull" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.093511 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c96b2b-489e-47dd-9a49-30ee58d31916" containerName="extract" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.093924 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.095789 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-64hn6" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.121881 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg"] Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.178416 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtp6m\" (UniqueName: \"kubernetes.io/projected/1d0cab62-1f81-48c0-a3b3-3a774fcd7b18-kube-api-access-wtp6m\") pod \"openstack-operator-controller-operator-5d6f666fbc-smswg\" (UID: \"1d0cab62-1f81-48c0-a3b3-3a774fcd7b18\") " pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.279563 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtp6m\" (UniqueName: \"kubernetes.io/projected/1d0cab62-1f81-48c0-a3b3-3a774fcd7b18-kube-api-access-wtp6m\") pod \"openstack-operator-controller-operator-5d6f666fbc-smswg\" (UID: \"1d0cab62-1f81-48c0-a3b3-3a774fcd7b18\") " pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.299526 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtp6m\" (UniqueName: \"kubernetes.io/projected/1d0cab62-1f81-48c0-a3b3-3a774fcd7b18-kube-api-access-wtp6m\") pod \"openstack-operator-controller-operator-5d6f666fbc-smswg\" (UID: \"1d0cab62-1f81-48c0-a3b3-3a774fcd7b18\") " pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.410440 5004 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:20 crc kubenswrapper[5004]: I1203 14:22:20.751608 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg"] Dec 03 14:22:21 crc kubenswrapper[5004]: I1203 14:22:21.390478 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" event={"ID":"1d0cab62-1f81-48c0-a3b3-3a774fcd7b18","Type":"ContainerStarted","Data":"224d5548e36b471705ddc47279ed1641c087075b661becaad74681c14f61a829"} Dec 03 14:22:27 crc kubenswrapper[5004]: I1203 14:22:27.436592 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" event={"ID":"1d0cab62-1f81-48c0-a3b3-3a774fcd7b18","Type":"ContainerStarted","Data":"bff5384f90442ae7ce82a1e301f0b88525737fffa9ade151d8720e440bb49c75"} Dec 03 14:22:27 crc kubenswrapper[5004]: I1203 14:22:27.437092 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:27 crc kubenswrapper[5004]: I1203 14:22:27.464297 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" podStartSLOduration=1.1247800589999999 podStartE2EDuration="7.464276891s" podCreationTimestamp="2025-12-03 14:22:20 +0000 UTC" firstStartedPulling="2025-12-03 14:22:20.758981149 +0000 UTC m=+953.507951385" lastFinishedPulling="2025-12-03 14:22:27.098477961 +0000 UTC m=+959.847448217" observedRunningTime="2025-12-03 14:22:27.459414052 +0000 UTC m=+960.208384288" watchObservedRunningTime="2025-12-03 14:22:27.464276891 +0000 UTC m=+960.213247127" Dec 03 14:22:40 crc kubenswrapper[5004]: I1203 
14:22:40.414625 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5d6f666fbc-smswg" Dec 03 14:22:52 crc kubenswrapper[5004]: I1203 14:22:52.824372 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:22:52 crc kubenswrapper[5004]: I1203 14:22:52.824968 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.712400 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.713813 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.716437 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-x8bzg" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.730626 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.762521 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.764320 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.784274 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrr8\" (UniqueName: \"kubernetes.io/projected/34247b31-24ab-4386-8bf1-f0bfa7df6f00-kube-api-access-prrr8\") pod \"barbican-operator-controller-manager-7d9dfd778-l4b9j\" (UID: \"34247b31-24ab-4386-8bf1-f0bfa7df6f00\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.787951 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-d6p88" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.811152 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.834060 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh"] Dec 
03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.835062 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.838827 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-p2gsq" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.858913 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.881000 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.882188 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.887211 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ktv5z" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.887816 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrr8\" (UniqueName: \"kubernetes.io/projected/34247b31-24ab-4386-8bf1-f0bfa7df6f00-kube-api-access-prrr8\") pod \"barbican-operator-controller-manager-7d9dfd778-l4b9j\" (UID: \"34247b31-24ab-4386-8bf1-f0bfa7df6f00\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.887892 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szbql\" (UniqueName: \"kubernetes.io/projected/04d75592-adf5-42b6-a02e-0074674b393d-kube-api-access-szbql\") pod 
\"cinder-operator-controller-manager-859b6ccc6-qffz8\" (UID: \"04d75592-adf5-42b6-a02e-0074674b393d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.887922 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8dh\" (UniqueName: \"kubernetes.io/projected/e44a1a8b-fd83-478f-9095-73e2f82ed81c-kube-api-access-vq8dh\") pod \"designate-operator-controller-manager-78b4bc895b-97hkh\" (UID: \"e44a1a8b-fd83-478f-9095-73e2f82ed81c\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.911200 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.912218 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.916604 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-zqhfp" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.931066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrr8\" (UniqueName: \"kubernetes.io/projected/34247b31-24ab-4386-8bf1-f0bfa7df6f00-kube-api-access-prrr8\") pod \"barbican-operator-controller-manager-7d9dfd778-l4b9j\" (UID: \"34247b31-24ab-4386-8bf1-f0bfa7df6f00\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.945919 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.960825 5004 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.991935 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2"] Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.993021 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.994208 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4cx\" (UniqueName: \"kubernetes.io/projected/271911f5-3a7c-448b-976d-268c5b19edc1-kube-api-access-5m4cx\") pod \"glance-operator-controller-manager-77987cd8cd-4lhqd\" (UID: \"271911f5-3a7c-448b-976d-268c5b19edc1\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.994274 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbql\" (UniqueName: \"kubernetes.io/projected/04d75592-adf5-42b6-a02e-0074674b393d-kube-api-access-szbql\") pod \"cinder-operator-controller-manager-859b6ccc6-qffz8\" (UID: \"04d75592-adf5-42b6-a02e-0074674b393d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.994299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8dh\" (UniqueName: \"kubernetes.io/projected/e44a1a8b-fd83-478f-9095-73e2f82ed81c-kube-api-access-vq8dh\") pod \"designate-operator-controller-manager-78b4bc895b-97hkh\" (UID: \"e44a1a8b-fd83-478f-9095-73e2f82ed81c\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 
14:23:10.994323 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vsv\" (UniqueName: \"kubernetes.io/projected/9be3a985-7677-4334-b270-386feb954a5c-kube-api-access-c2vsv\") pod \"heat-operator-controller-manager-5f64f6f8bb-thtjj\" (UID: \"9be3a985-7677-4334-b270-386feb954a5c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:23:10 crc kubenswrapper[5004]: I1203 14:23:10.997344 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r7d67" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.014417 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.032314 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.032545 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8dh\" (UniqueName: \"kubernetes.io/projected/e44a1a8b-fd83-478f-9095-73e2f82ed81c-kube-api-access-vq8dh\") pod \"designate-operator-controller-manager-78b4bc895b-97hkh\" (UID: \"e44a1a8b-fd83-478f-9095-73e2f82ed81c\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.032615 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.033535 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.041810 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbql\" (UniqueName: \"kubernetes.io/projected/04d75592-adf5-42b6-a02e-0074674b393d-kube-api-access-szbql\") pod \"cinder-operator-controller-manager-859b6ccc6-qffz8\" (UID: \"04d75592-adf5-42b6-a02e-0074674b393d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.041932 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4mtfp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.041994 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.048881 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.069820 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.070836 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.085055 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-k9kfp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.090479 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.095201 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4cx\" (UniqueName: \"kubernetes.io/projected/271911f5-3a7c-448b-976d-268c5b19edc1-kube-api-access-5m4cx\") pod \"glance-operator-controller-manager-77987cd8cd-4lhqd\" (UID: \"271911f5-3a7c-448b-976d-268c5b19edc1\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.095237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdl7n\" (UniqueName: \"kubernetes.io/projected/f10a5021-1caf-47ba-8dce-51021a641f4c-kube-api-access-gdl7n\") pod \"horizon-operator-controller-manager-68c6d99b8f-lprd2\" (UID: \"f10a5021-1caf-47ba-8dce-51021a641f4c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.095257 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.095314 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mwm\" (UniqueName: \"kubernetes.io/projected/3741a6af-989d-47ac-a6ee-a6443a4f2883-kube-api-access-r4mwm\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.095335 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vsv\" (UniqueName: \"kubernetes.io/projected/9be3a985-7677-4334-b270-386feb954a5c-kube-api-access-c2vsv\") pod \"heat-operator-controller-manager-5f64f6f8bb-thtjj\" (UID: \"9be3a985-7677-4334-b270-386feb954a5c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.096204 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.098025 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.103293 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-g8bwh" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.107258 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.126521 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.134537 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4cx\" (UniqueName: \"kubernetes.io/projected/271911f5-3a7c-448b-976d-268c5b19edc1-kube-api-access-5m4cx\") pod \"glance-operator-controller-manager-77987cd8cd-4lhqd\" (UID: \"271911f5-3a7c-448b-976d-268c5b19edc1\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.138598 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vsv\" (UniqueName: \"kubernetes.io/projected/9be3a985-7677-4334-b270-386feb954a5c-kube-api-access-c2vsv\") pod \"heat-operator-controller-manager-5f64f6f8bb-thtjj\" (UID: \"9be3a985-7677-4334-b270-386feb954a5c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.146754 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.148058 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.154206 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.157737 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2lwqn" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.161297 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.162524 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.168956 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.172960 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-bkxvw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.173416 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.174430 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.184553 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-77gdd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.196736 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdl7n\" (UniqueName: \"kubernetes.io/projected/f10a5021-1caf-47ba-8dce-51021a641f4c-kube-api-access-gdl7n\") pod \"horizon-operator-controller-manager-68c6d99b8f-lprd2\" (UID: \"f10a5021-1caf-47ba-8dce-51021a641f4c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.196786 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.196850 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlrk\" (UniqueName: \"kubernetes.io/projected/3c18cd5e-8d20-4a2b-a62c-d141de1fc38a-kube-api-access-gqlrk\") pod \"keystone-operator-controller-manager-7765d96ddf-nwtth\" (UID: \"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.196911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/419e5e47-1866-473a-a668-2fee54cb76ce-kube-api-access-769jc\") pod 
\"ironic-operator-controller-manager-6c548fd776-6fqgd\" (UID: \"419e5e47-1866-473a-a668-2fee54cb76ce\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.196941 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mwm\" (UniqueName: \"kubernetes.io/projected/3741a6af-989d-47ac-a6ee-a6443a4f2883-kube-api-access-r4mwm\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.197312 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.197384 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert podName:3741a6af-989d-47ac-a6ee-a6443a4f2883 nodeName:}" failed. No retries permitted until 2025-12-03 14:23:11.697361648 +0000 UTC m=+1004.446331884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert") pod "infra-operator-controller-manager-57548d458d-pv9cw" (UID: "3741a6af-989d-47ac-a6ee-a6443a4f2883") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.209745 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.212437 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.231478 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdl7n\" (UniqueName: \"kubernetes.io/projected/f10a5021-1caf-47ba-8dce-51021a641f4c-kube-api-access-gdl7n\") pod \"horizon-operator-controller-manager-68c6d99b8f-lprd2\" (UID: \"f10a5021-1caf-47ba-8dce-51021a641f4c\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.233991 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.249580 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mwm\" (UniqueName: \"kubernetes.io/projected/3741a6af-989d-47ac-a6ee-a6443a4f2883-kube-api-access-r4mwm\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.260983 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.262007 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.264732 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.265291 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mskbr" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.272023 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gndmp"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.273392 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.277836 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f2264" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.279517 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.300626 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2xds\" (UniqueName: \"kubernetes.io/projected/b70998ef-a4ea-49a9-922d-d7ad70346932-kube-api-access-m2xds\") pod \"manila-operator-controller-manager-7c79b5df47-4ck5g\" (UID: \"b70998ef-a4ea-49a9-922d-d7ad70346932\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.300686 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85s9w\" (UniqueName: \"kubernetes.io/projected/f35c5faa-53cc-4829-91a0-1c422eae75f6-kube-api-access-85s9w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4b4kl\" (UID: \"f35c5faa-53cc-4829-91a0-1c422eae75f6\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.300743 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n849g\" (UniqueName: \"kubernetes.io/projected/a38ab130-8698-49c3-bf30-355f88bcdc45-kube-api-access-n849g\") pod \"mariadb-operator-controller-manager-56bbcc9d85-kffh7\" (UID: \"a38ab130-8698-49c3-bf30-355f88bcdc45\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.300775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlrk\" (UniqueName: \"kubernetes.io/projected/3c18cd5e-8d20-4a2b-a62c-d141de1fc38a-kube-api-access-gqlrk\") pod \"keystone-operator-controller-manager-7765d96ddf-nwtth\" (UID: \"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a\") " 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.300828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/419e5e47-1866-473a-a668-2fee54cb76ce-kube-api-access-769jc\") pod \"ironic-operator-controller-manager-6c548fd776-6fqgd\" (UID: \"419e5e47-1866-473a-a668-2fee54cb76ce\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.313238 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gndmp"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.316429 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.322939 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.327569 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.332529 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ltl8s" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.345209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769jc\" (UniqueName: \"kubernetes.io/projected/419e5e47-1866-473a-a668-2fee54cb76ce-kube-api-access-769jc\") pod \"ironic-operator-controller-manager-6c548fd776-6fqgd\" (UID: \"419e5e47-1866-473a-a668-2fee54cb76ce\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.350390 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlrk\" (UniqueName: \"kubernetes.io/projected/3c18cd5e-8d20-4a2b-a62c-d141de1fc38a-kube-api-access-gqlrk\") pod \"keystone-operator-controller-manager-7765d96ddf-nwtth\" (UID: \"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.361986 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.367795 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.371370 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-lrtnf" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.371573 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.373730 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.389899 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-5z95c"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.395275 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.396780 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-5z95c"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.398262 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bnm9g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.402915 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75wr\" (UniqueName: \"kubernetes.io/projected/2aab9a50-58d3-4eba-8589-c009d3b2b604-kube-api-access-n75wr\") pod \"ovn-operator-controller-manager-b6456fdb6-td552\" (UID: \"2aab9a50-58d3-4eba-8589-c009d3b2b604\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.402981 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2xds\" (UniqueName: \"kubernetes.io/projected/b70998ef-a4ea-49a9-922d-d7ad70346932-kube-api-access-m2xds\") pod \"manila-operator-controller-manager-7c79b5df47-4ck5g\" (UID: \"b70998ef-a4ea-49a9-922d-d7ad70346932\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.403002 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ptr\" (UniqueName: \"kubernetes.io/projected/dd7dec16-458d-46f6-9ee6-b0db6551792a-kube-api-access-59ptr\") pod \"nova-operator-controller-manager-697bc559fc-st4w4\" (UID: \"dd7dec16-458d-46f6-9ee6-b0db6551792a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.403019 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ncr4\" (UniqueName: \"kubernetes.io/projected/a1d5cb2a-85a6-4ff0-a9cf-519397479d2c-kube-api-access-2ncr4\") pod \"octavia-operator-controller-manager-998648c74-gndmp\" (UID: \"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.403038 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85s9w\" (UniqueName: \"kubernetes.io/projected/f35c5faa-53cc-4829-91a0-1c422eae75f6-kube-api-access-85s9w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4b4kl\" (UID: \"f35c5faa-53cc-4829-91a0-1c422eae75f6\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.403071 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n849g\" (UniqueName: \"kubernetes.io/projected/a38ab130-8698-49c3-bf30-355f88bcdc45-kube-api-access-n849g\") pod \"mariadb-operator-controller-manager-56bbcc9d85-kffh7\" (UID: \"a38ab130-8698-49c3-bf30-355f88bcdc45\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.447838 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85s9w\" (UniqueName: \"kubernetes.io/projected/f35c5faa-53cc-4829-91a0-1c422eae75f6-kube-api-access-85s9w\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-4b4kl\" (UID: \"f35c5faa-53cc-4829-91a0-1c422eae75f6\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.449499 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2xds\" (UniqueName: 
\"kubernetes.io/projected/b70998ef-a4ea-49a9-922d-d7ad70346932-kube-api-access-m2xds\") pod \"manila-operator-controller-manager-7c79b5df47-4ck5g\" (UID: \"b70998ef-a4ea-49a9-922d-d7ad70346932\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.460190 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.460563 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.472538 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n849g\" (UniqueName: \"kubernetes.io/projected/a38ab130-8698-49c3-bf30-355f88bcdc45-kube-api-access-n849g\") pod \"mariadb-operator-controller-manager-56bbcc9d85-kffh7\" (UID: \"a38ab130-8698-49c3-bf30-355f88bcdc45\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.475642 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.476815 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.482133 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.482577 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.487180 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-98kcx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.501585 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505201 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-swddq" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75wr\" (UniqueName: \"kubernetes.io/projected/2aab9a50-58d3-4eba-8589-c009d3b2b604-kube-api-access-n75wr\") pod \"ovn-operator-controller-manager-b6456fdb6-td552\" (UID: \"2aab9a50-58d3-4eba-8589-c009d3b2b604\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505766 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkx2q\" (UniqueName: \"kubernetes.io/projected/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-kube-api-access-rkx2q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505798 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505873 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59ptr\" (UniqueName: \"kubernetes.io/projected/dd7dec16-458d-46f6-9ee6-b0db6551792a-kube-api-access-59ptr\") pod \"nova-operator-controller-manager-697bc559fc-st4w4\" (UID: \"dd7dec16-458d-46f6-9ee6-b0db6551792a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ncr4\" (UniqueName: \"kubernetes.io/projected/a1d5cb2a-85a6-4ff0-a9cf-519397479d2c-kube-api-access-2ncr4\") pod \"octavia-operator-controller-manager-998648c74-gndmp\" (UID: \"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.505979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8pk\" (UniqueName: \"kubernetes.io/projected/90f6b1a6-2cd1-4649-b794-e00f64cd80cb-kube-api-access-fb8pk\") pod \"placement-operator-controller-manager-78f8948974-5z95c\" (UID: \"90f6b1a6-2cd1-4649-b794-e00f64cd80cb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.508311 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2"] Dec 03 14:23:11 crc 
kubenswrapper[5004]: I1203 14:23:11.520248 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.520535 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.536998 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75wr\" (UniqueName: \"kubernetes.io/projected/2aab9a50-58d3-4eba-8589-c009d3b2b604-kube-api-access-n75wr\") pod \"ovn-operator-controller-manager-b6456fdb6-td552\" (UID: \"2aab9a50-58d3-4eba-8589-c009d3b2b604\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.538239 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-92pmx"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.539624 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.541170 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ptr\" (UniqueName: \"kubernetes.io/projected/dd7dec16-458d-46f6-9ee6-b0db6551792a-kube-api-access-59ptr\") pod \"nova-operator-controller-manager-697bc559fc-st4w4\" (UID: \"dd7dec16-458d-46f6-9ee6-b0db6551792a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.549095 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.556432 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5xg7n" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.559077 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-92pmx"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.561778 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.571566 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ncr4\" (UniqueName: \"kubernetes.io/projected/a1d5cb2a-85a6-4ff0-a9cf-519397479d2c-kube-api-access-2ncr4\") pod \"octavia-operator-controller-manager-998648c74-gndmp\" (UID: \"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.575975 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.581335 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.588766 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-79n99" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.592416 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.606968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9t52\" (UniqueName: \"kubernetes.io/projected/206c7f05-3575-400e-a37b-ba608f159fc5-kube-api-access-v9t52\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8bnn2\" (UID: \"206c7f05-3575-400e-a37b-ba608f159fc5\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.607048 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkx2q\" (UniqueName: \"kubernetes.io/projected/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-kube-api-access-rkx2q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.607082 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.607795 5004 
secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.607900 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert podName:7dbdc2c5-5e0c-4315-b836-1acacf93df2d nodeName:}" failed. No retries permitted until 2025-12-03 14:23:12.10788269 +0000 UTC m=+1004.856852926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" (UID: "7dbdc2c5-5e0c-4315-b836-1acacf93df2d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.608099 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hml\" (UniqueName: \"kubernetes.io/projected/bf9d689f-bfab-4b05-9b08-d855836a7846-kube-api-access-65hml\") pod \"test-operator-controller-manager-5854674fcc-92pmx\" (UID: \"bf9d689f-bfab-4b05-9b08-d855836a7846\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.610031 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6bfs\" (UniqueName: \"kubernetes.io/projected/9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b-kube-api-access-p6bfs\") pod \"swift-operator-controller-manager-5f8c65bbfc-8xwrm\" (UID: \"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.610192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8pk\" (UniqueName: 
\"kubernetes.io/projected/90f6b1a6-2cd1-4649-b794-e00f64cd80cb-kube-api-access-fb8pk\") pod \"placement-operator-controller-manager-78f8948974-5z95c\" (UID: \"90f6b1a6-2cd1-4649-b794-e00f64cd80cb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.631166 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.635436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb8pk\" (UniqueName: \"kubernetes.io/projected/90f6b1a6-2cd1-4649-b794-e00f64cd80cb-kube-api-access-fb8pk\") pod \"placement-operator-controller-manager-78f8948974-5z95c\" (UID: \"90f6b1a6-2cd1-4649-b794-e00f64cd80cb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.635954 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkx2q\" (UniqueName: \"kubernetes.io/projected/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-kube-api-access-rkx2q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.637437 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.637626 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.639547 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.640797 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.645108 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xlk5f" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.645794 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.647077 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.652484 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.660628 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-lhf9f" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.704633 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.712164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.712228 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67fgg\" (UniqueName: \"kubernetes.io/projected/65687c7c-1b6d-485f-b99c-41706846c7a7-kube-api-access-67fgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gzd52\" (UID: \"65687c7c-1b6d-485f-b99c-41706846c7a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.712294 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.715005 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w952x\" (UniqueName: \"kubernetes.io/projected/298e9f66-a005-42bd-b2f6-4653a88e0177-kube-api-access-w952x\") pod \"watcher-operator-controller-manager-769dc69bc-rlxc5\" (UID: \"298e9f66-a005-42bd-b2f6-4653a88e0177\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:23:11 
crc kubenswrapper[5004]: I1203 14:23:11.715177 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9t52\" (UniqueName: \"kubernetes.io/projected/206c7f05-3575-400e-a37b-ba608f159fc5-kube-api-access-v9t52\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8bnn2\" (UID: \"206c7f05-3575-400e-a37b-ba608f159fc5\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.716830 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krx2s\" (UniqueName: \"kubernetes.io/projected/a3fd1093-3e64-4558-9314-355dbf1c8a8c-kube-api-access-krx2s\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.716920 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.717649 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.717768 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert podName:3741a6af-989d-47ac-a6ee-a6443a4f2883 nodeName:}" failed. No retries permitted until 2025-12-03 14:23:12.717734775 +0000 UTC m=+1005.466705021 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert") pod "infra-operator-controller-manager-57548d458d-pv9cw" (UID: "3741a6af-989d-47ac-a6ee-a6443a4f2883") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.731420 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.739921 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hml\" (UniqueName: \"kubernetes.io/projected/bf9d689f-bfab-4b05-9b08-d855836a7846-kube-api-access-65hml\") pod \"test-operator-controller-manager-5854674fcc-92pmx\" (UID: \"bf9d689f-bfab-4b05-9b08-d855836a7846\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.740198 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6bfs\" (UniqueName: \"kubernetes.io/projected/9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b-kube-api-access-p6bfs\") pod \"swift-operator-controller-manager-5f8c65bbfc-8xwrm\" (UID: \"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.746233 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9t52\" (UniqueName: \"kubernetes.io/projected/206c7f05-3575-400e-a37b-ba608f159fc5-kube-api-access-v9t52\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8bnn2\" (UID: \"206c7f05-3575-400e-a37b-ba608f159fc5\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.751709 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.757982 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hml\" (UniqueName: \"kubernetes.io/projected/bf9d689f-bfab-4b05-9b08-d855836a7846-kube-api-access-65hml\") pod \"test-operator-controller-manager-5854674fcc-92pmx\" (UID: \"bf9d689f-bfab-4b05-9b08-d855836a7846\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.767271 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6bfs\" (UniqueName: \"kubernetes.io/projected/9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b-kube-api-access-p6bfs\") pod \"swift-operator-controller-manager-5f8c65bbfc-8xwrm\" (UID: \"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.796521 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j"] Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.841487 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.841536 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w952x\" (UniqueName: \"kubernetes.io/projected/298e9f66-a005-42bd-b2f6-4653a88e0177-kube-api-access-w952x\") pod \"watcher-operator-controller-manager-769dc69bc-rlxc5\" (UID: 
\"298e9f66-a005-42bd-b2f6-4653a88e0177\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.841602 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krx2s\" (UniqueName: \"kubernetes.io/projected/a3fd1093-3e64-4558-9314-355dbf1c8a8c-kube-api-access-krx2s\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.841646 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.841668 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67fgg\" (UniqueName: \"kubernetes.io/projected/65687c7c-1b6d-485f-b99c-41706846c7a7-kube-api-access-67fgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gzd52\" (UID: \"65687c7c-1b6d-485f-b99c-41706846c7a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.842047 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.842095 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. 
No retries permitted until 2025-12-03 14:23:12.342079505 +0000 UTC m=+1005.091049741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.842518 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: E1203 14:23:11.842552 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:12.342541518 +0000 UTC m=+1005.091511754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.858032 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67fgg\" (UniqueName: \"kubernetes.io/projected/65687c7c-1b6d-485f-b99c-41706846c7a7-kube-api-access-67fgg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gzd52\" (UID: \"65687c7c-1b6d-485f-b99c-41706846c7a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.863439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krx2s\" (UniqueName: \"kubernetes.io/projected/a3fd1093-3e64-4558-9314-355dbf1c8a8c-kube-api-access-krx2s\") pod 
\"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.877789 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w952x\" (UniqueName: \"kubernetes.io/projected/298e9f66-a005-42bd-b2f6-4653a88e0177-kube-api-access-w952x\") pod \"watcher-operator-controller-manager-769dc69bc-rlxc5\" (UID: \"298e9f66-a005-42bd-b2f6-4653a88e0177\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.878086 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.922383 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.925430 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.958885 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:23:11 crc kubenswrapper[5004]: I1203 14:23:11.975327 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.030219 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8"] Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.042072 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd"] Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.048669 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh"] Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.088842 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.149170 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.149351 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.149415 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert podName:7dbdc2c5-5e0c-4315-b836-1acacf93df2d nodeName:}" failed. No retries permitted until 2025-12-03 14:23:13.149397063 +0000 UTC m=+1005.898367299 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" (UID: "7dbdc2c5-5e0c-4315-b836-1acacf93df2d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.226981 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.256408 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.266325 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.276553 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.353717 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.353812 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.353972 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.354029 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:13.35401129 +0000 UTC m=+1006.102981526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.354405 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.354434 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:13.354424262 +0000 UTC m=+1006.103394508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.432127 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.634486 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gndmp"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.644369 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.664097 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-5z95c"]
Dec 03 14:23:12 crc kubenswrapper[5004]: W1203 14:23:12.686601 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90f6b1a6_2cd1_4649_b794_e00f64cd80cb.slice/crio-9543706871ddfdc47573c7e71d6428d887e72a3f3eb89b7d469b709a8eb186c4 WatchSource:0}: Error finding container 9543706871ddfdc47573c7e71d6428d887e72a3f3eb89b7d469b709a8eb186c4: Status 404 returned error can't find the container with id 9543706871ddfdc47573c7e71d6428d887e72a3f3eb89b7d469b709a8eb186c4
Dec 03 14:23:12 crc kubenswrapper[5004]: W1203 14:23:12.695696 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd7dec16_458d_46f6_9ee6_b0db6551792a.slice/crio-6a983070d66a3809bf9322ba2683152da1b1b0c391963c9678def60a6bab2dd2 WatchSource:0}: Error finding container 6a983070d66a3809bf9322ba2683152da1b1b0c391963c9678def60a6bab2dd2: Status 404 returned error can't find the container with id 6a983070d66a3809bf9322ba2683152da1b1b0c391963c9678def60a6bab2dd2
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.695747 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.705374 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.753064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" event={"ID":"04d75592-adf5-42b6-a02e-0074674b393d","Type":"ContainerStarted","Data":"91f83bc24aec662e67da867359937605cd1bdcc7cde4f85c66a4a48149e47622"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.756817 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" event={"ID":"b70998ef-a4ea-49a9-922d-d7ad70346932","Type":"ContainerStarted","Data":"06c181693fc598c78da1e45d07175042499b2a93049268f9697cda871dccd395"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.758056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" event={"ID":"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c","Type":"ContainerStarted","Data":"704e51da203a361278f4d8d60e584ea2bd40469ce3d185157698320c4b9d178d"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.759289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" event={"ID":"e44a1a8b-fd83-478f-9095-73e2f82ed81c","Type":"ContainerStarted","Data":"2f7b74d16321a74c7d49f765af7483eb62f0cec5922fb73dbc88e037af57c301"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.761925 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" event={"ID":"271911f5-3a7c-448b-976d-268c5b19edc1","Type":"ContainerStarted","Data":"a5bbe73e131e88801d95e72c7604c0437f01c2211933b55177db6bbf4771dc7e"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.764339 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.764551 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.764691 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert podName:3741a6af-989d-47ac-a6ee-a6443a4f2883 nodeName:}" failed. No retries permitted until 2025-12-03 14:23:14.764669256 +0000 UTC m=+1007.513639492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert") pod "infra-operator-controller-manager-57548d458d-pv9cw" (UID: "3741a6af-989d-47ac-a6ee-a6443a4f2883") : secret "infra-operator-webhook-server-cert" not found
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.768415 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" event={"ID":"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a","Type":"ContainerStarted","Data":"fff75b35f02e72c1c1005d2331706808e030ccf1ee848399590efaf75a43591d"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.775509 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" event={"ID":"90f6b1a6-2cd1-4649-b794-e00f64cd80cb","Type":"ContainerStarted","Data":"9543706871ddfdc47573c7e71d6428d887e72a3f3eb89b7d469b709a8eb186c4"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.776533 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" event={"ID":"419e5e47-1866-473a-a668-2fee54cb76ce","Type":"ContainerStarted","Data":"324c042c0c318598797a8210f58325fa8513aaa2827f53d34755ee1c00514b03"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.777199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" event={"ID":"f35c5faa-53cc-4829-91a0-1c422eae75f6","Type":"ContainerStarted","Data":"fa5055f836f40dc18fdb225b02097d88ac4994d762bbf8206cc02aef62f21b87"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.777969 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" event={"ID":"9be3a985-7677-4334-b270-386feb954a5c","Type":"ContainerStarted","Data":"54c93d81ccff01d8236c36c20301c38c070ef58401725d3117047d348cc3ac81"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.778873 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" event={"ID":"dd7dec16-458d-46f6-9ee6-b0db6551792a","Type":"ContainerStarted","Data":"6a983070d66a3809bf9322ba2683152da1b1b0c391963c9678def60a6bab2dd2"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.780179 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" event={"ID":"a38ab130-8698-49c3-bf30-355f88bcdc45","Type":"ContainerStarted","Data":"65ebeee3f01f40f9241ca8209aebc4e17b1534d0892658cb4d30df2b87645bdc"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.782489 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" event={"ID":"f10a5021-1caf-47ba-8dce-51021a641f4c","Type":"ContainerStarted","Data":"57097f7f172153e99137c30f0fefc96cafb931efa90830103643d34140882064"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.783694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" event={"ID":"34247b31-24ab-4386-8bf1-f0bfa7df6f00","Type":"ContainerStarted","Data":"40e5ada7afdc9495303d2b4c7fe442432f725446446f59595a43d54e446796fc"}
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.826364 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552"]
Dec 03 14:23:12 crc kubenswrapper[5004]: W1203 14:23:12.833468 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aab9a50_58d3_4eba_8589_c009d3b2b604.slice/crio-d346334b8021f3ec7f79fa32dd64c4489a42b852100ee24a85c83a76872887bc WatchSource:0}: Error finding container d346334b8021f3ec7f79fa32dd64c4489a42b852100ee24a85c83a76872887bc: Status 404 returned error can't find the container with id d346334b8021f3ec7f79fa32dd64c4489a42b852100ee24a85c83a76872887bc
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.839341 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n75wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-td552_openstack-operators(2aab9a50-58d3-4eba-8589-c009d3b2b604): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.842289 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n75wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-td552_openstack-operators(2aab9a50-58d3-4eba-8589-c009d3b2b604): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.843628 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604"
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.923128 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.952170 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2"]
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.960509 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67fgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gzd52_openstack-operators(65687c7c-1b6d-485f-b99c-41706846c7a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.962439 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" podUID="65687c7c-1b6d-485f-b99c-41706846c7a7"
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.966021 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52"]
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.974396 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm"]
Dec 03 14:23:12 crc kubenswrapper[5004]: W1203 14:23:12.981617 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206c7f05_3575_400e_a37b_ba608f159fc5.slice/crio-a4d9d1a735cd90f07de2fb4be5164c864f533434a11cb21fd4212eb4aa016ee2 WatchSource:0}: Error finding container a4d9d1a735cd90f07de2fb4be5164c864f533434a11cb21fd4212eb4aa016ee2: Status 404 returned error can't find the container with id a4d9d1a735cd90f07de2fb4be5164c864f533434a11cb21fd4212eb4aa016ee2
Dec 03 14:23:12 crc kubenswrapper[5004]: I1203 14:23:12.986479 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-92pmx"]
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.986763 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v9t52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-8bnn2_openstack-operators(206c7f05-3575-400e-a37b-ba608f159fc5): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.988884 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v9t52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-8bnn2_openstack-operators(206c7f05-3575-400e-a37b-ba608f159fc5): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:12 crc kubenswrapper[5004]: E1203 14:23:12.990253 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" podUID="206c7f05-3575-400e-a37b-ba608f159fc5"
Dec 03 14:23:12 crc kubenswrapper[5004]: W1203 14:23:12.996583 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9d689f_bfab_4b05_9b08_d855836a7846.slice/crio-29a899aa041067656de24517f5ffe2c63cb60198da98b92be5bc1ed91982b685 WatchSource:0}: Error finding container 29a899aa041067656de24517f5ffe2c63cb60198da98b92be5bc1ed91982b685: Status 404 returned error can't find the container with id 29a899aa041067656de24517f5ffe2c63cb60198da98b92be5bc1ed91982b685
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.006167 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-65hml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-92pmx_openstack-operators(bf9d689f-bfab-4b05-9b08-d855836a7846): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.010313 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-65hml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-92pmx_openstack-operators(bf9d689f-bfab-4b05-9b08-d855836a7846): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.012221 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" podUID="bf9d689f-bfab-4b05-9b08-d855836a7846"
Dec 03 14:23:13 crc kubenswrapper[5004]: W1203 14:23:13.026636 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bec5a93_cc9c_4f46_8ecc_dcdde9f9023b.slice/crio-8f2ebc455972d139bfc367738a608e3cd5feddca3264754bf449c67caa9bcf8a WatchSource:0}: Error finding container 8f2ebc455972d139bfc367738a608e3cd5feddca3264754bf449c67caa9bcf8a: Status 404 returned error can't find the container with id 8f2ebc455972d139bfc367738a608e3cd5feddca3264754bf449c67caa9bcf8a
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.036443 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6bfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-8xwrm_openstack-operators(9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.039779 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6bfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-8xwrm_openstack-operators(9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.041013 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" podUID="9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b"
Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.174250 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.174486 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.174531 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert podName:7dbdc2c5-5e0c-4315-b836-1acacf93df2d nodeName:}" failed. No retries permitted until 2025-12-03 14:23:15.174516379 +0000 UTC m=+1007.923486615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" (UID: "7dbdc2c5-5e0c-4315-b836-1acacf93df2d") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.376977 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"
Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.377056 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.377147 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.377218 5004
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:15.377198691 +0000 UTC m=+1008.126168927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.377239 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.377314 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:15.377292754 +0000 UTC m=+1008.126262990 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.796601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" event={"ID":"298e9f66-a005-42bd-b2f6-4653a88e0177","Type":"ContainerStarted","Data":"b3311d71d6aa4be220ae5968f89ae3b2d1616166667499c4a5e22b39cd93becf"} Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.799322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" event={"ID":"206c7f05-3575-400e-a37b-ba608f159fc5","Type":"ContainerStarted","Data":"a4d9d1a735cd90f07de2fb4be5164c864f533434a11cb21fd4212eb4aa016ee2"} Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.809689 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" podUID="206c7f05-3575-400e-a37b-ba608f159fc5" Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.830868 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" event={"ID":"65687c7c-1b6d-485f-b99c-41706846c7a7","Type":"ContainerStarted","Data":"a53beeeb299a3962b6abff64959440f90fdd3df2b676b8bd943e7141add963e2"} Dec 03 
14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.833825 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" podUID="65687c7c-1b6d-485f-b99c-41706846c7a7" Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.836431 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" event={"ID":"bf9d689f-bfab-4b05-9b08-d855836a7846","Type":"ContainerStarted","Data":"29a899aa041067656de24517f5ffe2c63cb60198da98b92be5bc1ed91982b685"} Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.840357 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" podUID="bf9d689f-bfab-4b05-9b08-d855836a7846" Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.843963 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" event={"ID":"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b","Type":"ContainerStarted","Data":"8f2ebc455972d139bfc367738a608e3cd5feddca3264754bf449c67caa9bcf8a"} Dec 03 14:23:13 crc kubenswrapper[5004]: I1203 14:23:13.847699 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" 
event={"ID":"2aab9a50-58d3-4eba-8589-c009d3b2b604","Type":"ContainerStarted","Data":"d346334b8021f3ec7f79fa32dd64c4489a42b852100ee24a85c83a76872887bc"} Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.848530 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" podUID="9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b" Dec 03 14:23:13 crc kubenswrapper[5004]: E1203 14:23:13.849509 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604" Dec 03 14:23:14 crc kubenswrapper[5004]: I1203 14:23:14.808072 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.808272 5004 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.808536 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert podName:3741a6af-989d-47ac-a6ee-a6443a4f2883 nodeName:}" failed. No retries permitted until 2025-12-03 14:23:18.808516565 +0000 UTC m=+1011.557486801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert") pod "infra-operator-controller-manager-57548d458d-pv9cw" (UID: "3741a6af-989d-47ac-a6ee-a6443a4f2883") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.856354 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" podUID="65687c7c-1b6d-485f-b99c-41706846c7a7" Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.858211 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" podUID="206c7f05-3575-400e-a37b-ba608f159fc5" Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.858387 5004 pod_workers.go:1301] "Error syncing 
pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604" Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.858300 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" podUID="bf9d689f-bfab-4b05-9b08-d855836a7846" Dec 03 14:23:14 crc kubenswrapper[5004]: E1203 14:23:14.859792 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" podUID="9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b" Dec 03 14:23:15 crc kubenswrapper[5004]: I1203 14:23:15.215740 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.215936 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.216004 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert podName:7dbdc2c5-5e0c-4315-b836-1acacf93df2d nodeName:}" failed. No retries permitted until 2025-12-03 14:23:19.2159855 +0000 UTC m=+1011.964955736 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" (UID: "7dbdc2c5-5e0c-4315-b836-1acacf93df2d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:15 crc kubenswrapper[5004]: I1203 14:23:15.418809 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:15 crc kubenswrapper[5004]: I1203 14:23:15.418906 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: 
\"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.419118 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.419178 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:19.419160616 +0000 UTC m=+1012.168130852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.419402 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:23:15 crc kubenswrapper[5004]: E1203 14:23:15.419479 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:19.419460405 +0000 UTC m=+1012.168430701 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found Dec 03 14:23:18 crc kubenswrapper[5004]: I1203 14:23:18.864744 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:18 crc kubenswrapper[5004]: E1203 14:23:18.865238 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:18 crc kubenswrapper[5004]: E1203 14:23:18.865284 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert podName:3741a6af-989d-47ac-a6ee-a6443a4f2883 nodeName:}" failed. No retries permitted until 2025-12-03 14:23:26.865269378 +0000 UTC m=+1019.614239614 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert") pod "infra-operator-controller-manager-57548d458d-pv9cw" (UID: "3741a6af-989d-47ac-a6ee-a6443a4f2883") : secret "infra-operator-webhook-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: I1203 14:23:19.271393 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.271539 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.271618 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert podName:7dbdc2c5-5e0c-4315-b836-1acacf93df2d nodeName:}" failed. No retries permitted until 2025-12-03 14:23:27.2716013 +0000 UTC m=+1020.020571536 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" (UID: "7dbdc2c5-5e0c-4315-b836-1acacf93df2d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: I1203 14:23:19.480224 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:19 crc kubenswrapper[5004]: I1203 14:23:19.480650 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.480390 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.480737 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:27.480710696 +0000 UTC m=+1020.229680932 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.480847 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:23:19 crc kubenswrapper[5004]: E1203 14:23:19.480934 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:27.480912282 +0000 UTC m=+1020.229882518 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found Dec 03 14:23:22 crc kubenswrapper[5004]: I1203 14:23:22.823845 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:23:22 crc kubenswrapper[5004]: I1203 14:23:22.824257 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:23:25 crc kubenswrapper[5004]: E1203 14:23:25.284957 5004 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 03 14:23:25 crc kubenswrapper[5004]: E1203 14:23:25.285515 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5m4cx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-4lhqd_openstack-operators(271911f5-3a7c-448b-976d-268c5b19edc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:23:26 crc kubenswrapper[5004]: E1203 14:23:26.396025 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 03 14:23:26 crc kubenswrapper[5004]: E1203 14:23:26.396589 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-szbql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-qffz8_openstack-operators(04d75592-adf5-42b6-a02e-0074674b393d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:23:26 crc kubenswrapper[5004]: I1203 14:23:26.885487 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:26 crc kubenswrapper[5004]: I1203 14:23:26.890747 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3741a6af-989d-47ac-a6ee-a6443a4f2883-cert\") pod \"infra-operator-controller-manager-57548d458d-pv9cw\" (UID: \"3741a6af-989d-47ac-a6ee-a6443a4f2883\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.016143 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.290901 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.297483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dbdc2c5-5e0c-4315-b836-1acacf93df2d-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd46zspw\" (UID: \"7dbdc2c5-5e0c-4315-b836-1acacf93df2d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.389626 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.493949 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:27 crc kubenswrapper[5004]: E1203 14:23:27.494202 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 14:23:27 crc kubenswrapper[5004]: I1203 14:23:27.494204 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:27 crc kubenswrapper[5004]: E1203 14:23:27.494345 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:43.494316452 +0000 UTC m=+1036.243286688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "webhook-server-cert" not found Dec 03 14:23:27 crc kubenswrapper[5004]: E1203 14:23:27.494380 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 14:23:27 crc kubenswrapper[5004]: E1203 14:23:27.494575 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs podName:a3fd1093-3e64-4558-9314-355dbf1c8a8c nodeName:}" failed. No retries permitted until 2025-12-03 14:23:43.494547468 +0000 UTC m=+1036.243517704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs") pod "openstack-operator-controller-manager-79c58f7d4-4qmpw" (UID: "a3fd1093-3e64-4558-9314-355dbf1c8a8c") : secret "metrics-server-cert" not found Dec 03 14:23:41 crc kubenswrapper[5004]: E1203 14:23:41.548353 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 03 14:23:41 crc kubenswrapper[5004]: E1203 14:23:41.549005 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n849g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-kffh7_openstack-operators(a38ab130-8698-49c3-bf30-355f88bcdc45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.541060 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.541180 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.551755 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-webhook-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.551782 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3fd1093-3e64-4558-9314-355dbf1c8a8c-metrics-certs\") pod \"openstack-operator-controller-manager-79c58f7d4-4qmpw\" (UID: \"a3fd1093-3e64-4558-9314-355dbf1c8a8c\") " pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.578385 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xlk5f" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.588582 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:43 crc kubenswrapper[5004]: I1203 14:23:43.975476 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw"] Dec 03 14:23:44 crc kubenswrapper[5004]: W1203 14:23:44.060709 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dbdc2c5_5e0c_4315_b836_1acacf93df2d.slice/crio-87b80f67660383e64f8dba84cf1bef63466c8a4d07f2c9a789329862479cf661 WatchSource:0}: Error finding container 87b80f67660383e64f8dba84cf1bef63466c8a4d07f2c9a789329862479cf661: Status 404 returned error can't find the container with id 87b80f67660383e64f8dba84cf1bef63466c8a4d07f2c9a789329862479cf661 Dec 03 14:23:44 crc kubenswrapper[5004]: I1203 14:23:44.090096 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" event={"ID":"7dbdc2c5-5e0c-4315-b836-1acacf93df2d","Type":"ContainerStarted","Data":"87b80f67660383e64f8dba84cf1bef63466c8a4d07f2c9a789329862479cf661"} Dec 03 14:23:44 crc kubenswrapper[5004]: I1203 14:23:44.250750 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw"] Dec 03 14:23:44 crc kubenswrapper[5004]: W1203 14:23:44.276943 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3741a6af_989d_47ac_a6ee_a6443a4f2883.slice/crio-43fcb23d88cf696efc23c471f1e341a586f0ad6b381987b57f4b64e45422a0c9 WatchSource:0}: Error finding container 43fcb23d88cf696efc23c471f1e341a586f0ad6b381987b57f4b64e45422a0c9: Status 404 returned error can't find the container with id 43fcb23d88cf696efc23c471f1e341a586f0ad6b381987b57f4b64e45422a0c9 Dec 03 14:23:44 crc kubenswrapper[5004]: I1203 
14:23:44.326949 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw"] Dec 03 14:23:44 crc kubenswrapper[5004]: E1203 14:23:44.785832 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-prrr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-l4b9j_openstack-operators(34247b31-24ab-4386-8bf1-f0bfa7df6f00): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 14:23:44 crc kubenswrapper[5004]: E1203 14:23:44.787166 5004 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" podUID="34247b31-24ab-4386-8bf1-f0bfa7df6f00" Dec 03 14:23:44 crc kubenswrapper[5004]: E1203 14:23:44.913626 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 14:23:44 crc kubenswrapper[5004]: E1203 14:23:44.914285 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gqlrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-nwtth_openstack-operators(3c18cd5e-8d20-4a2b-a62c-d141de1fc38a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.135619 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" event={"ID":"e44a1a8b-fd83-478f-9095-73e2f82ed81c","Type":"ContainerStarted","Data":"d3be8e263597ee3705a0b9bf6c7f707217b87f2435e7673f0e5563cb6b9cd0e3"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.143038 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" 
event={"ID":"298e9f66-a005-42bd-b2f6-4653a88e0177","Type":"ContainerStarted","Data":"4282f4b869dbe0515c7349c545a3471439f602571658929ef8d947c46b39ff87"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.146177 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" event={"ID":"3741a6af-989d-47ac-a6ee-a6443a4f2883","Type":"ContainerStarted","Data":"43fcb23d88cf696efc23c471f1e341a586f0ad6b381987b57f4b64e45422a0c9"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.147607 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" event={"ID":"9be3a985-7677-4334-b270-386feb954a5c","Type":"ContainerStarted","Data":"37dbca63237c7fd295a2f36640199a7471ec49f6a8765998135b95b93d795b78"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.149134 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" event={"ID":"90f6b1a6-2cd1-4649-b794-e00f64cd80cb","Type":"ContainerStarted","Data":"e5939264f88d6112eb58c9c4ea5829aab8ee64939763a47bffb530c82a326a4f"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.159446 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" event={"ID":"f35c5faa-53cc-4829-91a0-1c422eae75f6","Type":"ContainerStarted","Data":"035756e672232afa8246e7d148be56c617c98e8096b41b75d293640f99353569"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.162514 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" event={"ID":"dd7dec16-458d-46f6-9ee6-b0db6551792a","Type":"ContainerStarted","Data":"944b2b5a9e7829e54cb91c705315231b2d3218d830418088a3aa4e67f5fb5894"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.176402 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" event={"ID":"34247b31-24ab-4386-8bf1-f0bfa7df6f00","Type":"ContainerStarted","Data":"94c3e2dd81706e4f183cdb75bc08ab631c370c6d330c24a4a38a18c07cd7f2bb"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.177314 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:45 crc kubenswrapper[5004]: E1203 14:23:45.178460 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" podUID="34247b31-24ab-4386-8bf1-f0bfa7df6f00" Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.181429 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" event={"ID":"419e5e47-1866-473a-a668-2fee54cb76ce","Type":"ContainerStarted","Data":"4338adad25852b93f6a41cadc2618df21c075668afc642ce1a5a42984fcd51e8"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.223164 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" event={"ID":"b70998ef-a4ea-49a9-922d-d7ad70346932","Type":"ContainerStarted","Data":"3d15cc21eda5f6550632312338ee59dab55dde1fbbd0343fdd193b8d67d652e1"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.233181 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" event={"ID":"f10a5021-1caf-47ba-8dce-51021a641f4c","Type":"ContainerStarted","Data":"2ef7c387ba8ce2312086592b1e0bb5ec1c0ee93aa8282cea3f03286a01a8e4ee"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.247308 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" event={"ID":"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c","Type":"ContainerStarted","Data":"85ea75d36cb6c4d85150b75ac75d9f88140ecd91be13f85499aad0fa9f2ec851"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.253129 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" event={"ID":"a3fd1093-3e64-4558-9314-355dbf1c8a8c","Type":"ContainerStarted","Data":"766e489c901e4a0f676fd2329e647590094d527436c8e214c7891d78ac968938"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.253182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" event={"ID":"a3fd1093-3e64-4558-9314-355dbf1c8a8c","Type":"ContainerStarted","Data":"880ec67fda58201e62f89ad77455afd263c8dfa3750b74b4fd57ca0bb4c2f056"} Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.254000 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:45 crc kubenswrapper[5004]: I1203 14:23:45.282632 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" podStartSLOduration=34.282615148 podStartE2EDuration="34.282615148s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:23:45.281438654 +0000 UTC m=+1038.030408910" watchObservedRunningTime="2025-12-03 14:23:45.282615148 +0000 UTC m=+1038.031585384" Dec 03 14:23:46 crc kubenswrapper[5004]: E1203 14:23:46.292090 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" podUID="34247b31-24ab-4386-8bf1-f0bfa7df6f00" Dec 03 14:23:51 crc kubenswrapper[5004]: I1203 14:23:51.036381 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" Dec 03 14:23:51 crc kubenswrapper[5004]: E1203 14:23:51.040498 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" podUID="34247b31-24ab-4386-8bf1-f0bfa7df6f00" Dec 03 14:23:52 crc kubenswrapper[5004]: I1203 14:23:52.824450 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:23:52 crc kubenswrapper[5004]: I1203 14:23:52.824535 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:23:52 crc kubenswrapper[5004]: I1203 14:23:52.824580 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:23:52 crc kubenswrapper[5004]: I1203 14:23:52.825246 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:23:52 crc kubenswrapper[5004]: I1203 14:23:52.825298 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6" gracePeriod=600 Dec 03 14:23:53 crc kubenswrapper[5004]: I1203 14:23:53.596715 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-79c58f7d4-4qmpw" Dec 03 14:23:54 crc kubenswrapper[5004]: I1203 14:23:54.349097 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6" exitCode=0 Dec 03 14:23:54 crc kubenswrapper[5004]: I1203 14:23:54.349175 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6"} Dec 03 14:23:54 crc kubenswrapper[5004]: I1203 14:23:54.349439 5004 scope.go:117] "RemoveContainer" containerID="9bd02c4c0d0b111db1f15844825941f9f11df38510f443008de55f9cd8344d21" Dec 03 14:24:03 crc kubenswrapper[5004]: E1203 14:24:03.931106 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 03 14:24:03 crc 
kubenswrapper[5004]: E1203 14:24:03.932209 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n75wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-td552_openstack-operators(2aab9a50-58d3-4eba-8589-c009d3b2b604): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:24:03 crc kubenswrapper[5004]: I1203 14:24:03.945501 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:24:04 crc kubenswrapper[5004]: E1203 14:24:04.480952 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 14:24:04 crc kubenswrapper[5004]: E1203 14:24:04.481133 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m2xds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-4ck5g_openstack-operators(b70998ef-a4ea-49a9-922d-d7ad70346932): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 14:24:04 crc kubenswrapper[5004]: E1203 14:24:04.483146 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" podUID="b70998ef-a4ea-49a9-922d-d7ad70346932" Dec 03 14:24:05 crc kubenswrapper[5004]: I1203 14:24:05.468926 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a"} Dec 03 14:24:05 crc kubenswrapper[5004]: I1203 14:24:05.470017 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:24:05 crc kubenswrapper[5004]: E1203 14:24:05.472977 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" podUID="b70998ef-a4ea-49a9-922d-d7ad70346932" Dec 03 14:24:05 crc kubenswrapper[5004]: I1203 14:24:05.476021 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.266394 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.266566 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59ptr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-st4w4_openstack-operators(dd7dec16-458d-46f6-9ee6-b0db6551792a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.267834 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" podUID="dd7dec16-458d-46f6-9ee6-b0db6551792a" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.394192 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.394743 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vq8dh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-97hkh_openstack-operators(e44a1a8b-fd83-478f-9095-73e2f82ed81c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.395929 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" 
podUID="e44a1a8b-fd83-478f-9095-73e2f82ed81c" Dec 03 14:24:06 crc kubenswrapper[5004]: I1203 14:24:06.485538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" event={"ID":"bf9d689f-bfab-4b05-9b08-d855836a7846","Type":"ContainerStarted","Data":"81dd32d29d6eb92c23d32d0d259b01f238af31f7dd7739b8c006207b9e651678"} Dec 03 14:24:06 crc kubenswrapper[5004]: I1203 14:24:06.487094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" event={"ID":"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b","Type":"ContainerStarted","Data":"1af841b1ac173038e6611661f7de74ad1573369a2f0b453872e012f4604737c5"} Dec 03 14:24:06 crc kubenswrapper[5004]: I1203 14:24:06.487896 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:24:06 crc kubenswrapper[5004]: I1203 14:24:06.490185 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" Dec 03 14:24:06 crc kubenswrapper[5004]: E1203 14:24:06.726173 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" podUID="a38ab130-8698-49c3-bf30-355f88bcdc45" Dec 03 14:24:07 crc kubenswrapper[5004]: E1203 14:24:07.227328 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" podUID="04d75592-adf5-42b6-a02e-0074674b393d" Dec 03 14:24:07 crc kubenswrapper[5004]: E1203 14:24:07.234349 5004 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" podUID="271911f5-3a7c-448b-976d-268c5b19edc1" Dec 03 14:24:07 crc kubenswrapper[5004]: E1203 14:24:07.237492 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604" Dec 03 14:24:07 crc kubenswrapper[5004]: E1203 14:24:07.328626 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" podUID="3c18cd5e-8d20-4a2b-a62c-d141de1fc38a" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.495186 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" event={"ID":"04d75592-adf5-42b6-a02e-0074674b393d","Type":"ContainerStarted","Data":"0aa38eb8d96263236f8df796aebf8fe6b13e4c026492a31403c9ff5deee5bf8c"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.496966 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" event={"ID":"206c7f05-3575-400e-a37b-ba608f159fc5","Type":"ContainerStarted","Data":"d7657407e930498a40df1fc4f5d250c734e76e5f279dc986b2f0b7ba3e7718a4"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.498779 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" 
event={"ID":"9be3a985-7677-4334-b270-386feb954a5c","Type":"ContainerStarted","Data":"b3f4756a0179a15fd9d07a96bded5134a4a56a82417c72a576ea3472a0d7735c"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.499242 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.503366 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" event={"ID":"bf9d689f-bfab-4b05-9b08-d855836a7846","Type":"ContainerStarted","Data":"37b02e29a5988d6ccd6705ba5046999bd02be8097682b2e371fe6533ec07f34d"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.503566 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.504787 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.505397 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" event={"ID":"271911f5-3a7c-448b-976d-268c5b19edc1","Type":"ContainerStarted","Data":"a3722463067752c96d39621dd6843a5f7f879b1732bcff35cce3d537a8e60b77"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.511634 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" event={"ID":"f35c5faa-53cc-4829-91a0-1c422eae75f6","Type":"ContainerStarted","Data":"9970ab7af71f0379730e2651f559129199f50f1301cb4a87191567e9af38e6bc"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.511831 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.513359 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.514115 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" event={"ID":"3741a6af-989d-47ac-a6ee-a6443a4f2883","Type":"ContainerStarted","Data":"da6c7e526d22b1ce683aa5e3de55aa1c4667c194c7bb2937d6e7068ba511959c"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.515404 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" event={"ID":"65687c7c-1b6d-485f-b99c-41706846c7a7","Type":"ContainerStarted","Data":"a1a27fea86c3d4aaafcdd7b3390ca7ad1bafd60b2d6e86ad4bb762b1a3b0dfec"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.519212 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" event={"ID":"419e5e47-1866-473a-a668-2fee54cb76ce","Type":"ContainerStarted","Data":"8e7011fa5760bf6d871b51a2a14faae3eabe3fbc304467ba8a4e8aa9bb172575"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.519418 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.521568 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.522997 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" 
event={"ID":"7dbdc2c5-5e0c-4315-b836-1acacf93df2d","Type":"ContainerStarted","Data":"beb3fd4fc94b796ca27bed1b75d47c145cdc29fd6484be1dacb58e9bfb6452fd"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.533911 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" event={"ID":"b70998ef-a4ea-49a9-922d-d7ad70346932","Type":"ContainerStarted","Data":"8f988ef1f41b6bff23daf07288ab70f0224a3f5c40f796007592723a06a4592e"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.542829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" event={"ID":"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a","Type":"ContainerStarted","Data":"277ec53dfc4670cf8115ef8aa1194581fdf9824b39e35e95c872b8e8a9509114"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.553731 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gzd52" podStartSLOduration=5.02988277 podStartE2EDuration="56.553706283s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.960243735 +0000 UTC m=+1005.709213971" lastFinishedPulling="2025-12-03 14:24:04.484067248 +0000 UTC m=+1057.233037484" observedRunningTime="2025-12-03 14:24:07.552951022 +0000 UTC m=+1060.301921258" watchObservedRunningTime="2025-12-03 14:24:07.553706283 +0000 UTC m=+1060.302676509" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.558688 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" event={"ID":"dd7dec16-458d-46f6-9ee6-b0db6551792a","Type":"ContainerStarted","Data":"8766349dc4c34b1eacb84e57aa533edb15fd8f32f360bdd2b009fcd37ebbc894"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.582131 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" event={"ID":"2aab9a50-58d3-4eba-8589-c009d3b2b604","Type":"ContainerStarted","Data":"e7d1d928b3aad06245cad8e84170f8130180e9326b55bcbb17cac102d1870deb"} Dec 03 14:24:07 crc kubenswrapper[5004]: E1203 14:24:07.583275 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.597910 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" event={"ID":"298e9f66-a005-42bd-b2f6-4653a88e0177","Type":"ContainerStarted","Data":"e1624b322d0e0bb1cbc3053336393f818b8616b062fdd3fcdc6874c7c65f970f"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.599170 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.608660 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.640777 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" event={"ID":"f10a5021-1caf-47ba-8dce-51021a641f4c","Type":"ContainerStarted","Data":"84b471f421edc6e53b7f1bf7e6490a0a39601b00ac4c5faa589bdbb4ca616666"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.640817 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.640874 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.648326 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" event={"ID":"a1d5cb2a-85a6-4ff0-a9cf-519397479d2c","Type":"ContainerStarted","Data":"339745c555f9bf0d42fca01c6e95466ae61b07863620b589f149e7e6d1b1d170"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.648659 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.657444 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.664660 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-6fqgd" podStartSLOduration=3.85648833 podStartE2EDuration="57.664642228s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.313323955 +0000 UTC m=+1005.062294191" lastFinishedPulling="2025-12-03 14:24:06.121477853 +0000 UTC m=+1058.870448089" observedRunningTime="2025-12-03 14:24:07.632318423 +0000 UTC m=+1060.381288659" watchObservedRunningTime="2025-12-03 14:24:07.664642228 +0000 UTC m=+1060.413612464" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.670566 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" 
event={"ID":"9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b","Type":"ContainerStarted","Data":"614121d0f266a4a502989f5a8d25966d7231bc50517d4f632de2a23b4c44b0a4"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.671178 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.691295 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-4b4kl" podStartSLOduration=4.290190116 podStartE2EDuration="57.691280571s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.705531913 +0000 UTC m=+1005.454502149" lastFinishedPulling="2025-12-03 14:24:06.106622358 +0000 UTC m=+1058.855592604" observedRunningTime="2025-12-03 14:24:07.690314363 +0000 UTC m=+1060.439284599" watchObservedRunningTime="2025-12-03 14:24:07.691280571 +0000 UTC m=+1060.440250807" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.695041 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" podStartSLOduration=4.744079597 podStartE2EDuration="56.695035688s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:13.005934203 +0000 UTC m=+1005.754904439" lastFinishedPulling="2025-12-03 14:24:04.956890294 +0000 UTC m=+1057.705860530" observedRunningTime="2025-12-03 14:24:07.660017636 +0000 UTC m=+1060.408987872" watchObservedRunningTime="2025-12-03 14:24:07.695035688 +0000 UTC m=+1060.444005924" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.699270 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" 
event={"ID":"90f6b1a6-2cd1-4649-b794-e00f64cd80cb","Type":"ContainerStarted","Data":"8b9bdcf9ea9185971c07d31d11af82cd50a48bea6b62b6467e861933891caf70"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.700268 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.705693 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.717794 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" event={"ID":"a38ab130-8698-49c3-bf30-355f88bcdc45","Type":"ContainerStarted","Data":"057f88bc423bc66b73402b6dccbfd2dc5f09e5eac598c9e39fa4e50c7710a304"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.738535 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-thtjj" podStartSLOduration=3.676032405 podStartE2EDuration="57.738519883s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.312111941 +0000 UTC m=+1005.061082177" lastFinishedPulling="2025-12-03 14:24:06.374599419 +0000 UTC m=+1059.123569655" observedRunningTime="2025-12-03 14:24:07.734063176 +0000 UTC m=+1060.483033412" watchObservedRunningTime="2025-12-03 14:24:07.738519883 +0000 UTC m=+1060.487490119" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.782799 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" event={"ID":"34247b31-24ab-4386-8bf1-f0bfa7df6f00","Type":"ContainerStarted","Data":"0b6692b56a878ea3462bbfe2b82c971eb9fe016c371eabd4b8958a909bd602d5"} Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 
14:24:07.816803 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gndmp" podStartSLOduration=3.36820053 podStartE2EDuration="56.816789714s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.652018151 +0000 UTC m=+1005.400988387" lastFinishedPulling="2025-12-03 14:24:06.100607335 +0000 UTC m=+1058.849577571" observedRunningTime="2025-12-03 14:24:07.81629815 +0000 UTC m=+1060.565268406" watchObservedRunningTime="2025-12-03 14:24:07.816789714 +0000 UTC m=+1060.565759950" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.821752 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-rlxc5" podStartSLOduration=3.666248933 podStartE2EDuration="56.821739806s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.945004379 +0000 UTC m=+1005.693974615" lastFinishedPulling="2025-12-03 14:24:06.100495252 +0000 UTC m=+1058.849465488" observedRunningTime="2025-12-03 14:24:07.783693376 +0000 UTC m=+1060.532663612" watchObservedRunningTime="2025-12-03 14:24:07.821739806 +0000 UTC m=+1060.570710042" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.856242 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" podStartSLOduration=5.408349604 podStartE2EDuration="56.856215623s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:13.036265271 +0000 UTC m=+1005.785235507" lastFinishedPulling="2025-12-03 14:24:04.48413128 +0000 UTC m=+1057.233101526" observedRunningTime="2025-12-03 14:24:07.855801491 +0000 UTC m=+1060.604771727" watchObservedRunningTime="2025-12-03 14:24:07.856215623 +0000 UTC m=+1060.605185859" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.893375 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-lprd2" podStartSLOduration=4.102471873 podStartE2EDuration="57.893358226s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.330413455 +0000 UTC m=+1005.079383701" lastFinishedPulling="2025-12-03 14:24:06.121299808 +0000 UTC m=+1058.870270054" observedRunningTime="2025-12-03 14:24:07.892212083 +0000 UTC m=+1060.641182319" watchObservedRunningTime="2025-12-03 14:24:07.893358226 +0000 UTC m=+1060.642328462" Dec 03 14:24:07 crc kubenswrapper[5004]: I1203 14:24:07.937023 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4ck5g" podStartSLOduration=26.640922702 podStartE2EDuration="57.937004985s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.43470657 +0000 UTC m=+1005.183676806" lastFinishedPulling="2025-12-03 14:23:43.730788863 +0000 UTC m=+1036.479759089" observedRunningTime="2025-12-03 14:24:07.936064298 +0000 UTC m=+1060.685034534" watchObservedRunningTime="2025-12-03 14:24:07.937004985 +0000 UTC m=+1060.685975211" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.007412 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-st4w4" podStartSLOduration=27.001405912 podStartE2EDuration="58.00739308s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.704437342 +0000 UTC m=+1005.453407578" lastFinishedPulling="2025-12-03 14:23:43.71042451 +0000 UTC m=+1036.459394746" observedRunningTime="2025-12-03 14:24:08.004471517 +0000 UTC m=+1060.753441753" watchObservedRunningTime="2025-12-03 14:24:08.00739308 +0000 UTC m=+1060.756363316" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.098836 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-5z95c" podStartSLOduration=3.69762988 podStartE2EDuration="57.098814397s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.69737799 +0000 UTC m=+1005.446348216" lastFinishedPulling="2025-12-03 14:24:06.098562497 +0000 UTC m=+1058.847532733" observedRunningTime="2025-12-03 14:24:08.036824283 +0000 UTC m=+1060.785794529" watchObservedRunningTime="2025-12-03 14:24:08.098814397 +0000 UTC m=+1060.847784633" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.238229 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-l4b9j" podStartSLOduration=3.729703671 podStartE2EDuration="58.238206658s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:11.823603676 +0000 UTC m=+1004.572573912" lastFinishedPulling="2025-12-03 14:24:06.332106663 +0000 UTC m=+1059.081076899" observedRunningTime="2025-12-03 14:24:08.193637172 +0000 UTC m=+1060.942607408" watchObservedRunningTime="2025-12-03 14:24:08.238206658 +0000 UTC m=+1060.987176894" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.790414 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" event={"ID":"3c18cd5e-8d20-4a2b-a62c-d141de1fc38a","Type":"ContainerStarted","Data":"27959182ed47311e437fd0e4d2369a0fea6a88a492ea8de9d4d75a2b69ec5a44"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.791903 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.792430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" event={"ID":"a38ab130-8698-49c3-bf30-355f88bcdc45","Type":"ContainerStarted","Data":"3f5404ac4fab0aef8d3b47a987fd87ea2c5983fe9861d0c47cb04371d8d805f1"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.792797 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.799768 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" event={"ID":"271911f5-3a7c-448b-976d-268c5b19edc1","Type":"ContainerStarted","Data":"ab08a3e790ba965d6ef454a6a44bf1d94c9eebf01a8d086e224f51a30e09f10f"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.800518 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.802263 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" event={"ID":"e44a1a8b-fd83-478f-9095-73e2f82ed81c","Type":"ContainerStarted","Data":"fec08c8695e77967fa6189f1d296c6b96a0dc73462549b395ed55f2c14d0776c"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.802986 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.804970 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.805036 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" 
event={"ID":"04d75592-adf5-42b6-a02e-0074674b393d","Type":"ContainerStarted","Data":"dfc79dc6b2f878db080ad031a4345ba94a2a5dc2229ad4391cf9244624e85f32"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.805126 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.806641 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" event={"ID":"3741a6af-989d-47ac-a6ee-a6443a4f2883","Type":"ContainerStarted","Data":"44b51883a7e8f3cf3bafed7f2149f6d9c3afe381791894fa40160ad24541540b"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.806754 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.808517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" event={"ID":"206c7f05-3575-400e-a37b-ba608f159fc5","Type":"ContainerStarted","Data":"22be5990b3fefe6fd995e871bf42f8963b38034b17552e45f900d0b1d10cf2ba"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.809043 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.811419 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" event={"ID":"7dbdc2c5-5e0c-4315-b836-1acacf93df2d","Type":"ContainerStarted","Data":"259a1d7131e8df3774dee07e89a96f8b8bd7e737ed5758cb13b0e5be13ec7cf5"} Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.811450 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.834616 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" podStartSLOduration=2.729704945 podStartE2EDuration="58.834594441s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.312340197 +0000 UTC m=+1005.061310423" lastFinishedPulling="2025-12-03 14:24:08.417229683 +0000 UTC m=+1061.166199919" observedRunningTime="2025-12-03 14:24:08.819081057 +0000 UTC m=+1061.568051293" watchObservedRunningTime="2025-12-03 14:24:08.834594441 +0000 UTC m=+1061.583564667" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.855966 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" podStartSLOduration=2.92795751 podStartE2EDuration="58.855947612s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.091046262 +0000 UTC m=+1004.840016508" lastFinishedPulling="2025-12-03 14:24:08.019036374 +0000 UTC m=+1060.768006610" observedRunningTime="2025-12-03 14:24:08.853347328 +0000 UTC m=+1061.602317564" watchObservedRunningTime="2025-12-03 14:24:08.855947612 +0000 UTC m=+1061.604917848" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.872550 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" podStartSLOduration=2.8149923770000003 podStartE2EDuration="58.872534487s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.091608738 +0000 UTC m=+1004.840578974" lastFinishedPulling="2025-12-03 14:24:08.149150848 +0000 UTC m=+1060.898121084" observedRunningTime="2025-12-03 14:24:08.867307987 +0000 UTC m=+1061.616278223" 
watchObservedRunningTime="2025-12-03 14:24:08.872534487 +0000 UTC m=+1061.621504723" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.891928 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" podStartSLOduration=5.9217109709999995 podStartE2EDuration="57.891908012s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.986644141 +0000 UTC m=+1005.735614377" lastFinishedPulling="2025-12-03 14:24:04.956841192 +0000 UTC m=+1057.705811418" observedRunningTime="2025-12-03 14:24:08.887190837 +0000 UTC m=+1061.636161073" watchObservedRunningTime="2025-12-03 14:24:08.891908012 +0000 UTC m=+1061.640878248" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.928265 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" podStartSLOduration=37.507348842 podStartE2EDuration="57.928241542s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:44.063160258 +0000 UTC m=+1036.812130494" lastFinishedPulling="2025-12-03 14:24:04.484052968 +0000 UTC m=+1057.233023194" observedRunningTime="2025-12-03 14:24:08.920555542 +0000 UTC m=+1061.669525798" watchObservedRunningTime="2025-12-03 14:24:08.928241542 +0000 UTC m=+1061.677211778" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.950378 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" podStartSLOduration=38.279479289 podStartE2EDuration="58.950362135s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:44.286135961 +0000 UTC m=+1037.035106197" lastFinishedPulling="2025-12-03 14:24:04.957018807 +0000 UTC m=+1057.705989043" observedRunningTime="2025-12-03 14:24:08.949054817 +0000 UTC 
m=+1061.698025043" watchObservedRunningTime="2025-12-03 14:24:08.950362135 +0000 UTC m=+1061.699332371" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.984551 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-97hkh" podStartSLOduration=27.326072986 podStartE2EDuration="58.984534373s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.086917884 +0000 UTC m=+1004.835888120" lastFinishedPulling="2025-12-03 14:23:43.745379271 +0000 UTC m=+1036.494349507" observedRunningTime="2025-12-03 14:24:08.982747412 +0000 UTC m=+1061.731717658" watchObservedRunningTime="2025-12-03 14:24:08.984534373 +0000 UTC m=+1061.733504609" Dec 03 14:24:08 crc kubenswrapper[5004]: I1203 14:24:08.987031 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" podStartSLOduration=3.250223696 podStartE2EDuration="58.987022784s" podCreationTimestamp="2025-12-03 14:23:10 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.680434635 +0000 UTC m=+1005.429404871" lastFinishedPulling="2025-12-03 14:24:08.417233733 +0000 UTC m=+1061.166203959" observedRunningTime="2025-12-03 14:24:08.97148475 +0000 UTC m=+1061.720454996" watchObservedRunningTime="2025-12-03 14:24:08.987022784 +0000 UTC m=+1061.735993020" Dec 03 14:24:11 crc kubenswrapper[5004]: I1203 14:24:11.927226 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-8xwrm" Dec 03 14:24:11 crc kubenswrapper[5004]: I1203 14:24:11.928407 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-92pmx" Dec 03 14:24:11 crc kubenswrapper[5004]: I1203 14:24:11.964332 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8bnn2" Dec 03 14:24:17 crc kubenswrapper[5004]: I1203 14:24:17.022954 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pv9cw" Dec 03 14:24:17 crc kubenswrapper[5004]: I1203 14:24:17.395370 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd46zspw" Dec 03 14:24:19 crc kubenswrapper[5004]: E1203 14:24:19.617130 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podUID="2aab9a50-58d3-4eba-8589-c009d3b2b604" Dec 03 14:24:21 crc kubenswrapper[5004]: I1203 14:24:21.112046 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qffz8" Dec 03 14:24:21 crc kubenswrapper[5004]: I1203 14:24:21.216471 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4lhqd" Dec 03 14:24:21 crc kubenswrapper[5004]: I1203 14:24:21.486189 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-nwtth" Dec 03 14:24:21 crc kubenswrapper[5004]: I1203 14:24:21.553514 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-kffh7" Dec 03 14:24:34 crc kubenswrapper[5004]: I1203 14:24:34.014135 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" event={"ID":"2aab9a50-58d3-4eba-8589-c009d3b2b604","Type":"ContainerStarted","Data":"770efe4ddc6f980dd8783528f0d5dc7f0f791a416608fcb3cd4250fa6817b456"} Dec 03 14:24:34 crc kubenswrapper[5004]: I1203 14:24:34.016426 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:24:34 crc kubenswrapper[5004]: I1203 14:24:34.047544 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" podStartSLOduration=2.976054255 podStartE2EDuration="1m23.047513853s" podCreationTimestamp="2025-12-03 14:23:11 +0000 UTC" firstStartedPulling="2025-12-03 14:23:12.839165879 +0000 UTC m=+1005.588136125" lastFinishedPulling="2025-12-03 14:24:32.910625497 +0000 UTC m=+1085.659595723" observedRunningTime="2025-12-03 14:24:34.038933977 +0000 UTC m=+1086.787904233" watchObservedRunningTime="2025-12-03 14:24:34.047513853 +0000 UTC m=+1086.796484119" Dec 03 14:24:41 crc kubenswrapper[5004]: I1203 14:24:41.755637 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-td552" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.941229 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.943618 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.947154 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-mw8pm" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.947230 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.947595 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.949524 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 14:24:56 crc kubenswrapper[5004]: I1203 14:24:56.958510 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.002482 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.002569 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s88cc\" (UniqueName: \"kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.038510 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.039965 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.044770 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.057548 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.103741 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.103884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.103912 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.104209 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s88cc\" (UniqueName: \"kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 
14:24:57.104287 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8ln\" (UniqueName: \"kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.105634 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.123252 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s88cc\" (UniqueName: \"kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc\") pod \"dnsmasq-dns-675f4bcbfc-jh8vm\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.207332 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8ln\" (UniqueName: \"kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.207803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.207849 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.208969 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.209702 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.225784 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8ln\" (UniqueName: \"kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln\") pod \"dnsmasq-dns-78dd6ddcc-9srd9\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.263814 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.362511 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.728640 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:24:57 crc kubenswrapper[5004]: I1203 14:24:57.820879 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:24:58 crc kubenswrapper[5004]: I1203 14:24:58.183064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" event={"ID":"17736995-f3a0-4c44-b71a-db92f1371baa","Type":"ContainerStarted","Data":"f66381006d46adbb80083f04f74179d1dbd6cb194260d02a79acdaed80d07416"} Dec 03 14:24:58 crc kubenswrapper[5004]: I1203 14:24:58.184518 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" event={"ID":"d63087ad-f25b-4292-ba82-d7df8e313480","Type":"ContainerStarted","Data":"6359c9dfe8af5901565cb839295c040bcf57631d19462b51a783205899be1825"} Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.116430 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.133416 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.135365 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.151403 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.170102 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.170180 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.170200 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k7w4\" (UniqueName: \"kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.272668 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k7w4\" (UniqueName: \"kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.272769 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.272835 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.273842 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.273846 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.332217 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k7w4\" (UniqueName: \"kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4\") pod \"dnsmasq-dns-666b6646f7-dd59j\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.419577 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.441598 5004 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.444004 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.456974 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.465145 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.578567 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.578630 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.578831 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrfd\" (UniqueName: \"kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.680907 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.680957 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrfd\" (UniqueName: \"kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.681256 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.682685 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.685544 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.698367 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvrfd\" (UniqueName: \"kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd\") pod 
\"dnsmasq-dns-57d769cc4f-72mlm\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:00 crc kubenswrapper[5004]: I1203 14:25:00.802025 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.069702 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:01 crc kubenswrapper[5004]: W1203 14:25:01.087046 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c0fb8a_87cc_4b30_894f_a6d6b180a636.slice/crio-dc90469645a82aea019c0671edada83091de7c1a32deec206255f3bc072babd3 WatchSource:0}: Error finding container dc90469645a82aea019c0671edada83091de7c1a32deec206255f3bc072babd3: Status 404 returned error can't find the container with id dc90469645a82aea019c0671edada83091de7c1a32deec206255f3bc072babd3 Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.222083 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" event={"ID":"c3c0fb8a-87cc-4b30-894f-a6d6b180a636","Type":"ContainerStarted","Data":"dc90469645a82aea019c0671edada83091de7c1a32deec206255f3bc072babd3"} Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.287026 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:25:01 crc kubenswrapper[5004]: W1203 14:25:01.288970 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ddfa3a_a5ef_40f7_8920_ba29e6ca5b1a.slice/crio-719259b9781b7c53304d9b921f84d11733a77b9e2420476d8ab1c01ef6f637f1 WatchSource:0}: Error finding container 719259b9781b7c53304d9b921f84d11733a77b9e2420476d8ab1c01ef6f637f1: Status 404 returned error can't find the container with id 
719259b9781b7c53304d9b921f84d11733a77b9e2420476d8ab1c01ef6f637f1 Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.297583 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.301290 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.305702 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.307754 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.312075 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g9m58" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.312293 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.313468 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.313670 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.313790 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.313827 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.390911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.390962 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.390984 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391000 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hw7g\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391051 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391074 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391157 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391202 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391219 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.391568 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.392166 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.493794 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.493915 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.493952 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.493967 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.493989 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494007 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hw7g\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494077 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494099 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.494141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.495302 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.495563 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.495891 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.495943 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.498504 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.500024 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.502346 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.502356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.502417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.524652 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.529465 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hw7g\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.531928 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.607239 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.608459 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.613215 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.613242 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jt4gj"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.613380 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.613486 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.617668 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.617821 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.623563 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.624833 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.664234 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697653 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697723 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697781 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697802 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697837 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697905 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.697983 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.698018 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.698091 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.698114 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jw4x\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.698143 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.803803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.803886 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.803930 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.803964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804003 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804105 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804163 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804188 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jw4x\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804249 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.804312 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.805620 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.805850 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.806149 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.809926 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.810559 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.810587 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.810680 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.811088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.811951 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.825672 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.830625 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jw4x\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.850455 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:01 crc kubenswrapper[5004]: I1203 14:25:01.996929 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 14:25:02 crc kubenswrapper[5004]: I1203 14:25:02.244642 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" event={"ID":"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a","Type":"ContainerStarted","Data":"719259b9781b7c53304d9b921f84d11733a77b9e2420476d8ab1c01ef6f637f1"}
Dec 03 14:25:02 crc kubenswrapper[5004]: I1203 14:25:02.372057 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 14:25:02 crc kubenswrapper[5004]: W1203 14:25:02.388321 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffbbacf9_4c9b_47ac_9ff7_76bee9534490.slice/crio-b7a6b3cf9067dbcadb7a7672a3086ebac3fcab41c0a098d2f31a2af239b71e29 WatchSource:0}: Error finding container b7a6b3cf9067dbcadb7a7672a3086ebac3fcab41c0a098d2f31a2af239b71e29: Status 404 returned error can't find the container with id b7a6b3cf9067dbcadb7a7672a3086ebac3fcab41c0a098d2f31a2af239b71e29
Dec 03 14:25:02 crc kubenswrapper[5004]: I1203 14:25:02.648645 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.177632 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.182224 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.190991 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-z6pfz"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.197099 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.205891 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.206453 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.235766 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.253723 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.259183 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerStarted","Data":"b7a6b3cf9067dbcadb7a7672a3086ebac3fcab41c0a098d2f31a2af239b71e29"}
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347799 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347880 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347917 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347939 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.347988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb7z9\" (UniqueName: \"kubernetes.io/projected/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kube-api-access-xb7z9\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.348008 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.348022 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.449762 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.449884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.449922 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.449990 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.450054 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.450072 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb7z9\" (UniqueName: \"kubernetes.io/projected/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kube-api-access-xb7z9\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.450098 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.450137 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.450731 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.451085 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.451379 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.452047 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.454092 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.464369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.464369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.472982 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb7z9\" (UniqueName: \"kubernetes.io/projected/affd9c16-d0c4-4c54-b438-bdb3a4cafdd8-kube-api-access-xb7z9\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.493164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8\") " pod="openstack/openstack-galera-0"
Dec 03 14:25:03 crc kubenswrapper[5004]: I1203 14:25:03.549133 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.386572 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.390702 5004 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.394460 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.394909 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.395119 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.396300 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5b9gz" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.398144 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572609 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572693 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572717 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxl78\" (UniqueName: \"kubernetes.io/projected/83c3d56e-3bcd-407c-97e1-113485660567-kube-api-access-sxl78\") pod 
\"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572737 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572766 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572808 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572837 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.572939 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/83c3d56e-3bcd-407c-97e1-113485660567-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.617627 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.618719 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.622086 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-p2bqq" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.622327 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.622587 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.642301 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.675954 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676038 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 
14:25:04.676068 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676091 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/83c3d56e-3bcd-407c-97e1-113485660567-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676188 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676256 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxl78\" (UniqueName: \"kubernetes.io/projected/83c3d56e-3bcd-407c-97e1-113485660567-kube-api-access-sxl78\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.676285 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.677644 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/83c3d56e-3bcd-407c-97e1-113485660567-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.678308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.678372 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.680268 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.681217 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/83c3d56e-3bcd-407c-97e1-113485660567-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.683663 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.697246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c3d56e-3bcd-407c-97e1-113485660567-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.704541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxl78\" (UniqueName: \"kubernetes.io/projected/83c3d56e-3bcd-407c-97e1-113485660567-kube-api-access-sxl78\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.744552 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"83c3d56e-3bcd-407c-97e1-113485660567\") " pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.780452 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-combined-ca-bundle\") pod \"memcached-0\" 
(UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.780816 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.780926 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bws8z\" (UniqueName: \"kubernetes.io/projected/3bfe51e0-df6b-446f-9647-d9165f3cdead-kube-api-access-bws8z\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.783190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-config-data\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.783265 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-kolla-config\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.884802 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 
14:25:04.884902 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.884944 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bws8z\" (UniqueName: \"kubernetes.io/projected/3bfe51e0-df6b-446f-9647-d9165f3cdead-kube-api-access-bws8z\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.884973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-config-data\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.885019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-kolla-config\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.886375 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-config-data\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.886597 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3bfe51e0-df6b-446f-9647-d9165f3cdead-kolla-config\") pod \"memcached-0\" (UID: 
\"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.888851 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.889909 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfe51e0-df6b-446f-9647-d9165f3cdead-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.915441 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bws8z\" (UniqueName: \"kubernetes.io/projected/3bfe51e0-df6b-446f-9647-d9165f3cdead-kube-api-access-bws8z\") pod \"memcached-0\" (UID: \"3bfe51e0-df6b-446f-9647-d9165f3cdead\") " pod="openstack/memcached-0" Dec 03 14:25:04 crc kubenswrapper[5004]: I1203 14:25:04.959360 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 14:25:05 crc kubenswrapper[5004]: I1203 14:25:05.033224 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.454614 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.456202 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.464677 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-g48z5" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.468946 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.622347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcjt\" (UniqueName: \"kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt\") pod \"kube-state-metrics-0\" (UID: \"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd\") " pod="openstack/kube-state-metrics-0" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.723796 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcjt\" (UniqueName: \"kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt\") pod \"kube-state-metrics-0\" (UID: \"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd\") " pod="openstack/kube-state-metrics-0" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.743433 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcjt\" (UniqueName: \"kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt\") pod \"kube-state-metrics-0\" (UID: \"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd\") " pod="openstack/kube-state-metrics-0" Dec 03 14:25:06 crc kubenswrapper[5004]: I1203 14:25:06.772289 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.071175 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zdz2r"] Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.073517 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.077003 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rhs89" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.077531 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.077663 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.095096 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-65kf4"] Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.096917 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.127701 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zdz2r"] Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.141929 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-65kf4"] Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182583 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182649 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-scripts\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182689 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-log-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182719 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czhwh\" (UniqueName: \"kubernetes.io/projected/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-kube-api-access-czhwh\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182760 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-ovn-controller-tls-certs\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182775 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.182806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-combined-ca-bundle\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284236 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-ovn-controller-tls-certs\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284328 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-etc-ovs\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284357 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-combined-ca-bundle\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284384 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8gph\" (UniqueName: \"kubernetes.io/projected/038a0c7b-ce5f-481a-b716-e6b5f3077655-kube-api-access-p8gph\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284762 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/038a0c7b-ce5f-481a-b716-e6b5f3077655-scripts\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284814 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-run\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284881 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-scripts\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.284985 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.285002 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-log-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.285276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-run\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.285511 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-var-log-ovn\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " 
pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.285921 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-lib\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.285995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czhwh\" (UniqueName: \"kubernetes.io/projected/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-kube-api-access-czhwh\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.286080 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-log\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.288433 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-scripts\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.299246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-ovn-controller-tls-certs\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.300251 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-combined-ca-bundle\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.302406 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czhwh\" (UniqueName: \"kubernetes.io/projected/9cf66a90-3f7d-4170-8dab-9ff58ba576a3-kube-api-access-czhwh\") pod \"ovn-controller-zdz2r\" (UID: \"9cf66a90-3f7d-4170-8dab-9ff58ba576a3\") " pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390111 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-etc-ovs\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390238 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8gph\" (UniqueName: \"kubernetes.io/projected/038a0c7b-ce5f-481a-b716-e6b5f3077655-kube-api-access-p8gph\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390313 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/038a0c7b-ce5f-481a-b716-e6b5f3077655-scripts\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390344 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-run\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390470 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-lib\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-log\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390607 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-etc-ovs\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390699 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-run\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.390890 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-log\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " 
pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.391130 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/038a0c7b-ce5f-481a-b716-e6b5f3077655-var-lib\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.393196 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/038a0c7b-ce5f-481a-b716-e6b5f3077655-scripts\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.400102 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.407627 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8gph\" (UniqueName: \"kubernetes.io/projected/038a0c7b-ce5f-481a-b716-e6b5f3077655-kube-api-access-p8gph\") pod \"ovn-controller-ovs-65kf4\" (UID: \"038a0c7b-ce5f-481a-b716-e6b5f3077655\") " pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.418903 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.940309 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.941938 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.945258 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.945278 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.946311 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.946561 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.948232 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-thjc5" Dec 03 14:25:10 crc kubenswrapper[5004]: I1203 14:25:10.954689 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.102803 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.102874 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.102911 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.102959 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.103001 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdwz9\" (UniqueName: \"kubernetes.io/projected/116764ff-36a4-444f-8051-e93b94a548fd-kube-api-access-sdwz9\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.103044 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.103086 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.103170 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/116764ff-36a4-444f-8051-e93b94a548fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.205800 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/116764ff-36a4-444f-8051-e93b94a548fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.205903 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.205937 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.205974 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.206031 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.206082 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdwz9\" (UniqueName: \"kubernetes.io/projected/116764ff-36a4-444f-8051-e93b94a548fd-kube-api-access-sdwz9\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.206136 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.206177 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.206742 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.210173 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.208101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/116764ff-36a4-444f-8051-e93b94a548fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.212972 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116764ff-36a4-444f-8051-e93b94a548fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.224543 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.224542 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.225497 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116764ff-36a4-444f-8051-e93b94a548fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.232969 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdwz9\" (UniqueName: \"kubernetes.io/projected/116764ff-36a4-444f-8051-e93b94a548fd-kube-api-access-sdwz9\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.240781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"116764ff-36a4-444f-8051-e93b94a548fd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.282757 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:11 crc kubenswrapper[5004]: I1203 14:25:11.340465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerStarted","Data":"0acf4a4af55697106abbdb02a68ba69aa50b535a573ae1d6bf621b1bd20e69a7"} Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.019695 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.022096 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.030607 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.031014 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zjzjn" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.031059 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.031113 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.040554 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156451 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156507 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-config\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156554 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfr2t\" (UniqueName: \"kubernetes.io/projected/4d5df9d0-5ee6-4981-86b4-e90415206ceb-kube-api-access-kfr2t\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156589 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156613 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.156634 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 
crc kubenswrapper[5004]: I1203 14:25:14.258011 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258406 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258696 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258747 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258766 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-config\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.258804 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfr2t\" (UniqueName: \"kubernetes.io/projected/4d5df9d0-5ee6-4981-86b4-e90415206ceb-kube-api-access-kfr2t\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.259135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.259135 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.260565 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-config\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" 
Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.260854 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d5df9d0-5ee6-4981-86b4-e90415206ceb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.265510 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.265580 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.267832 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d5df9d0-5ee6-4981-86b4-e90415206ceb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.280103 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfr2t\" (UniqueName: \"kubernetes.io/projected/4d5df9d0-5ee6-4981-86b4-e90415206ceb-kube-api-access-kfr2t\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.290207 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4d5df9d0-5ee6-4981-86b4-e90415206ceb\") " pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:14 crc kubenswrapper[5004]: I1203 14:25:14.349729 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:17 crc kubenswrapper[5004]: I1203 14:25:17.489448 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 14:25:21 crc kubenswrapper[5004]: I1203 14:25:21.420589 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8","Type":"ContainerStarted","Data":"843acbb3ba840b17ef638e4ef21874aa9d696291f66e501f493cc213309e6b7e"} Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.591129 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.591279 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9k7w4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-dd59j_openstack(c3c0fb8a-87cc-4b30-894f-a6d6b180a636): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.592943 5004 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" podUID="c3c0fb8a-87cc-4b30-894f-a6d6b180a636" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.594046 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.594221 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvrfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-72mlm_openstack(38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.595565 5004 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" podUID="38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.605012 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.605160 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s88cc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jh8vm_openstack(17736995-f3a0-4c44-b71a-db92f1371baa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.607138 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" podUID="17736995-f3a0-4c44-b71a-db92f1371baa" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.612435 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.612666 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xf8ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9srd9_openstack(d63087ad-f25b-4292-ba82-d7df8e313480): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:25:21 crc kubenswrapper[5004]: E1203 14:25:21.614112 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" podUID="d63087ad-f25b-4292-ba82-d7df8e313480" Dec 03 14:25:22 crc kubenswrapper[5004]: E1203 14:25:22.428746 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" podUID="c3c0fb8a-87cc-4b30-894f-a6d6b180a636" Dec 03 14:25:22 crc kubenswrapper[5004]: E1203 14:25:22.428746 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" podUID="38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" Dec 03 14:25:22 crc kubenswrapper[5004]: E1203 14:25:22.574131 5004 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 14:25:22 crc kubenswrapper[5004]: E1203 14:25:22.574320 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(ffbbacf9-4c9b-47ac-9ff7-76bee9534490): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:25:22 crc 
kubenswrapper[5004]: E1203 14:25:22.575629 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" Dec 03 14:25:22 crc kubenswrapper[5004]: I1203 14:25:22.842406 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:25:22 crc kubenswrapper[5004]: I1203 14:25:22.901226 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.024844 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc\") pod \"d63087ad-f25b-4292-ba82-d7df8e313480\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025170 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s88cc\" (UniqueName: \"kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc\") pod \"17736995-f3a0-4c44-b71a-db92f1371baa\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025243 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf8ln\" (UniqueName: \"kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln\") pod \"d63087ad-f25b-4292-ba82-d7df8e313480\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025342 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config\") pod \"d63087ad-f25b-4292-ba82-d7df8e313480\" (UID: \"d63087ad-f25b-4292-ba82-d7df8e313480\") " Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025370 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config\") pod \"17736995-f3a0-4c44-b71a-db92f1371baa\" (UID: \"17736995-f3a0-4c44-b71a-db92f1371baa\") " Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025438 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d63087ad-f25b-4292-ba82-d7df8e313480" (UID: "d63087ad-f25b-4292-ba82-d7df8e313480"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.025722 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.026102 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config" (OuterVolumeSpecName: "config") pod "d63087ad-f25b-4292-ba82-d7df8e313480" (UID: "d63087ad-f25b-4292-ba82-d7df8e313480"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.026202 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config" (OuterVolumeSpecName: "config") pod "17736995-f3a0-4c44-b71a-db92f1371baa" (UID: "17736995-f3a0-4c44-b71a-db92f1371baa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.030340 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc" (OuterVolumeSpecName: "kube-api-access-s88cc") pod "17736995-f3a0-4c44-b71a-db92f1371baa" (UID: "17736995-f3a0-4c44-b71a-db92f1371baa"). InnerVolumeSpecName "kube-api-access-s88cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.031056 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln" (OuterVolumeSpecName: "kube-api-access-xf8ln") pod "d63087ad-f25b-4292-ba82-d7df8e313480" (UID: "d63087ad-f25b-4292-ba82-d7df8e313480"). InnerVolumeSpecName "kube-api-access-xf8ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.127082 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf8ln\" (UniqueName: \"kubernetes.io/projected/d63087ad-f25b-4292-ba82-d7df8e313480-kube-api-access-xf8ln\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.127119 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63087ad-f25b-4292-ba82-d7df8e313480-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.127135 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17736995-f3a0-4c44-b71a-db92f1371baa-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.127147 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s88cc\" (UniqueName: 
\"kubernetes.io/projected/17736995-f3a0-4c44-b71a-db92f1371baa-kube-api-access-s88cc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.132695 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.144703 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zdz2r"] Dec 03 14:25:23 crc kubenswrapper[5004]: W1203 14:25:23.149108 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf66a90_3f7d_4170_8dab_9ff58ba576a3.slice/crio-f367b4e83806755904f72c9e6758eb6e2abc87561dc87b22cb895c8e45d62c84 WatchSource:0}: Error finding container f367b4e83806755904f72c9e6758eb6e2abc87561dc87b22cb895c8e45d62c84: Status 404 returned error can't find the container with id f367b4e83806755904f72c9e6758eb6e2abc87561dc87b22cb895c8e45d62c84 Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.257916 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.262791 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 14:25:23 crc kubenswrapper[5004]: W1203 14:25:23.273155 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83c3d56e_3bcd_407c_97e1_113485660567.slice/crio-f87674e3135e3d3818d06c817d0eeca95b22a7f2e49ed9890e8e73cf97e3d043 WatchSource:0}: Error finding container f87674e3135e3d3818d06c817d0eeca95b22a7f2e49ed9890e8e73cf97e3d043: Status 404 returned error can't find the container with id f87674e3135e3d3818d06c817d0eeca95b22a7f2e49ed9890e8e73cf97e3d043 Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.362635 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-65kf4"] Dec 03 14:25:23 
crc kubenswrapper[5004]: W1203 14:25:23.402730 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod038a0c7b_ce5f_481a_b716_e6b5f3077655.slice/crio-6ca2aba4ec87f49b28cecd7ec66199432a4826d60f73a513d2afcb2d0238dcd7 WatchSource:0}: Error finding container 6ca2aba4ec87f49b28cecd7ec66199432a4826d60f73a513d2afcb2d0238dcd7: Status 404 returned error can't find the container with id 6ca2aba4ec87f49b28cecd7ec66199432a4826d60f73a513d2afcb2d0238dcd7 Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.434146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"83c3d56e-3bcd-407c-97e1-113485660567","Type":"ContainerStarted","Data":"f87674e3135e3d3818d06c817d0eeca95b22a7f2e49ed9890e8e73cf97e3d043"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.435359 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" event={"ID":"17736995-f3a0-4c44-b71a-db92f1371baa","Type":"ContainerDied","Data":"f66381006d46adbb80083f04f74179d1dbd6cb194260d02a79acdaed80d07416"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.435396 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jh8vm" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.439796 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65kf4" event={"ID":"038a0c7b-ce5f-481a-b716-e6b5f3077655","Type":"ContainerStarted","Data":"6ca2aba4ec87f49b28cecd7ec66199432a4826d60f73a513d2afcb2d0238dcd7"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.441282 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.441279 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9srd9" event={"ID":"d63087ad-f25b-4292-ba82-d7df8e313480","Type":"ContainerDied","Data":"6359c9dfe8af5901565cb839295c040bcf57631d19462b51a783205899be1825"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.442659 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd","Type":"ContainerStarted","Data":"e243d8535961cf55fafe3e2da7aa1ed788b2b7642d3d4649aa9f7807e0ed9ef7"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.444105 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r" event={"ID":"9cf66a90-3f7d-4170-8dab-9ff58ba576a3","Type":"ContainerStarted","Data":"f367b4e83806755904f72c9e6758eb6e2abc87561dc87b22cb895c8e45d62c84"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.446276 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bfe51e0-df6b-446f-9647-d9165f3cdead","Type":"ContainerStarted","Data":"bd151e90e71e83fbb726ba6323e405fb1dafd5a7eccda1a9d6a4df4ed8e7974f"} Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.524068 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.533221 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9srd9"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.553398 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.558955 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.563990 
5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jh8vm"] Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.659967 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17736995-f3a0-4c44-b71a-db92f1371baa" path="/var/lib/kubelet/pods/17736995-f3a0-4c44-b71a-db92f1371baa/volumes" Dec 03 14:25:23 crc kubenswrapper[5004]: I1203 14:25:23.660407 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63087ad-f25b-4292-ba82-d7df8e313480" path="/var/lib/kubelet/pods/d63087ad-f25b-4292-ba82-d7df8e313480/volumes" Dec 03 14:25:24 crc kubenswrapper[5004]: I1203 14:25:24.182110 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 14:25:24 crc kubenswrapper[5004]: I1203 14:25:24.453187 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"116764ff-36a4-444f-8051-e93b94a548fd","Type":"ContainerStarted","Data":"4f672ae61b3d626314881ea4cde499f3fad2aaa8f3f40fbdd28ec4cfd31f617a"} Dec 03 14:25:24 crc kubenswrapper[5004]: I1203 14:25:24.454937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerStarted","Data":"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9"} Dec 03 14:25:25 crc kubenswrapper[5004]: I1203 14:25:25.464264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerStarted","Data":"885deb67571b0f40c6fdcdd93d6440c32639996ce8f2cef0da52a94aa94e93d5"} Dec 03 14:25:25 crc kubenswrapper[5004]: W1203 14:25:25.602586 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d5df9d0_5ee6_4981_86b4_e90415206ceb.slice/crio-e9e77ff261a2cb4d036480e493a99962925027eabc7d0ef23b3228834c1846d1 WatchSource:0}: 
Error finding container e9e77ff261a2cb4d036480e493a99962925027eabc7d0ef23b3228834c1846d1: Status 404 returned error can't find the container with id e9e77ff261a2cb4d036480e493a99962925027eabc7d0ef23b3228834c1846d1 Dec 03 14:25:26 crc kubenswrapper[5004]: I1203 14:25:26.476069 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4d5df9d0-5ee6-4981-86b4-e90415206ceb","Type":"ContainerStarted","Data":"e9e77ff261a2cb4d036480e493a99962925027eabc7d0ef23b3228834c1846d1"} Dec 03 14:25:27 crc kubenswrapper[5004]: I1203 14:25:27.483681 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"83c3d56e-3bcd-407c-97e1-113485660567","Type":"ContainerStarted","Data":"6319f46d4ee6e2af384914d5105894afd5f405cf4e4c667eae69a4b5b4f76d5d"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.510260 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3bfe51e0-df6b-446f-9647-d9165f3cdead","Type":"ContainerStarted","Data":"63512d930d52b1e4ea1e824b2f83b1d963417920c5b5a3fe028ce110573f3c52"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.511061 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.513986 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"116764ff-36a4-444f-8051-e93b94a548fd","Type":"ContainerStarted","Data":"29a325357ab6928732c1ce962c59ba5cc38da6e24c444efded02407f12b39683"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.516713 5004 generic.go:334] "Generic (PLEG): container finished" podID="038a0c7b-ce5f-481a-b716-e6b5f3077655" containerID="f2f73cbcf7451c6fd308eaf93c7dc8fa3412875df485ed3681984985aaed7ec6" exitCode=0 Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.516799 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-65kf4" event={"ID":"038a0c7b-ce5f-481a-b716-e6b5f3077655","Type":"ContainerDied","Data":"f2f73cbcf7451c6fd308eaf93c7dc8fa3412875df485ed3681984985aaed7ec6"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.518804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd","Type":"ContainerStarted","Data":"c69d72633e6999d234db76423d2f8667cbed3c362e71754589644013ec0bb721"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.518936 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.521371 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4d5df9d0-5ee6-4981-86b4-e90415206ceb","Type":"ContainerStarted","Data":"32c77c9351be44663946236c6ee48ce707780a3b80e52aae29295bda7580af9a"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.530000 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8","Type":"ContainerStarted","Data":"56bf7da8b0127155a032d591fee238d414099ad12294f6b9ea4b54bcf84a8704"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.534743 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r" event={"ID":"9cf66a90-3f7d-4170-8dab-9ff58ba576a3","Type":"ContainerStarted","Data":"ba9fc9ec23ce9971e053abfb3046b3590cd6c18ea15cc0ef7ff9aa426cf0b748"} Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.534819 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zdz2r" Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.554371 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.518773051 podStartE2EDuration="26.554343303s" 
podCreationTimestamp="2025-12-03 14:25:04 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.13049977 +0000 UTC m=+1135.879470006" lastFinishedPulling="2025-12-03 14:25:29.166070022 +0000 UTC m=+1141.915040258" observedRunningTime="2025-12-03 14:25:30.532191999 +0000 UTC m=+1143.281162255" watchObservedRunningTime="2025-12-03 14:25:30.554343303 +0000 UTC m=+1143.303313549" Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.555263 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.288776427 podStartE2EDuration="24.555256579s" podCreationTimestamp="2025-12-03 14:25:06 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.268752798 +0000 UTC m=+1136.017723034" lastFinishedPulling="2025-12-03 14:25:29.53523295 +0000 UTC m=+1142.284203186" observedRunningTime="2025-12-03 14:25:30.546926751 +0000 UTC m=+1143.295897017" watchObservedRunningTime="2025-12-03 14:25:30.555256579 +0000 UTC m=+1143.304226825" Dec 03 14:25:30 crc kubenswrapper[5004]: I1203 14:25:30.622293 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zdz2r" podStartSLOduration=14.245133988 podStartE2EDuration="20.622270768s" podCreationTimestamp="2025-12-03 14:25:10 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.151379538 +0000 UTC m=+1135.900349774" lastFinishedPulling="2025-12-03 14:25:29.528516318 +0000 UTC m=+1142.277486554" observedRunningTime="2025-12-03 14:25:30.617771429 +0000 UTC m=+1143.366741665" watchObservedRunningTime="2025-12-03 14:25:30.622270768 +0000 UTC m=+1143.371241004" Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.549389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65kf4" event={"ID":"038a0c7b-ce5f-481a-b716-e6b5f3077655","Type":"ContainerStarted","Data":"57304a91324cca754aa572cdf32ffe8b4fbd53f8d1f2652cde47b3c3d1b4785f"} Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.550651 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-65kf4" event={"ID":"038a0c7b-ce5f-481a-b716-e6b5f3077655","Type":"ContainerStarted","Data":"5c23ad8dc4ae4a8a12e7f85e11463a8a5cd89268b7cd1b5e31aa099883a9d7ba"} Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.550727 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.552559 5004 generic.go:334] "Generic (PLEG): container finished" podID="83c3d56e-3bcd-407c-97e1-113485660567" containerID="6319f46d4ee6e2af384914d5105894afd5f405cf4e4c667eae69a4b5b4f76d5d" exitCode=0 Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.552734 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"83c3d56e-3bcd-407c-97e1-113485660567","Type":"ContainerDied","Data":"6319f46d4ee6e2af384914d5105894afd5f405cf4e4c667eae69a4b5b4f76d5d"} Dec 03 14:25:31 crc kubenswrapper[5004]: I1203 14:25:31.584061 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-65kf4" podStartSLOduration=15.465582156 podStartE2EDuration="21.584039751s" podCreationTimestamp="2025-12-03 14:25:10 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.404887505 +0000 UTC m=+1136.153857741" lastFinishedPulling="2025-12-03 14:25:29.5233451 +0000 UTC m=+1142.272315336" observedRunningTime="2025-12-03 14:25:31.573274292 +0000 UTC m=+1144.322244548" watchObservedRunningTime="2025-12-03 14:25:31.584039751 +0000 UTC m=+1144.333009987" Dec 03 14:25:32 crc kubenswrapper[5004]: I1203 14:25:32.565455 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"83c3d56e-3bcd-407c-97e1-113485660567","Type":"ContainerStarted","Data":"fc08a1a9352751f98d81a48550a8c591fc9af00d9cf8416b40853335960014ab"} Dec 03 14:25:32 crc kubenswrapper[5004]: I1203 14:25:32.565902 5004 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:25:32 crc kubenswrapper[5004]: I1203 14:25:32.585352 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.181064136 podStartE2EDuration="29.585331485s" podCreationTimestamp="2025-12-03 14:25:03 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.274811241 +0000 UTC m=+1136.023781477" lastFinishedPulling="2025-12-03 14:25:25.67907859 +0000 UTC m=+1138.428048826" observedRunningTime="2025-12-03 14:25:32.583924335 +0000 UTC m=+1145.332894581" watchObservedRunningTime="2025-12-03 14:25:32.585331485 +0000 UTC m=+1145.334301721" Dec 03 14:25:33 crc kubenswrapper[5004]: I1203 14:25:33.576711 5004 generic.go:334] "Generic (PLEG): container finished" podID="affd9c16-d0c4-4c54-b438-bdb3a4cafdd8" containerID="56bf7da8b0127155a032d591fee238d414099ad12294f6b9ea4b54bcf84a8704" exitCode=0 Dec 03 14:25:33 crc kubenswrapper[5004]: I1203 14:25:33.576881 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8","Type":"ContainerDied","Data":"56bf7da8b0127155a032d591fee238d414099ad12294f6b9ea4b54bcf84a8704"} Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.586212 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4d5df9d0-5ee6-4981-86b4-e90415206ceb","Type":"ContainerStarted","Data":"5f09256dbc77f843c318a5bf9b002b71231c46a6d65152396fec2e0d7a2eca74"} Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.588482 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"116764ff-36a4-444f-8051-e93b94a548fd","Type":"ContainerStarted","Data":"b3ed7ba96c47afca91c41d5e4c15458be8bab2bd2a5d7b4a2220b01525ae0fd6"} Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.590258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"affd9c16-d0c4-4c54-b438-bdb3a4cafdd8","Type":"ContainerStarted","Data":"fcbd0d43fd4671b67fa8beb8e20743aac4905bb8949cc883471a0e6052803fda"} Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.617570 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.041710941 podStartE2EDuration="22.617547731s" podCreationTimestamp="2025-12-03 14:25:12 +0000 UTC" firstStartedPulling="2025-12-03 14:25:25.609869088 +0000 UTC m=+1138.358839324" lastFinishedPulling="2025-12-03 14:25:34.185705878 +0000 UTC m=+1146.934676114" observedRunningTime="2025-12-03 14:25:34.61160165 +0000 UTC m=+1147.360571906" watchObservedRunningTime="2025-12-03 14:25:34.617547731 +0000 UTC m=+1147.366517987" Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.634758 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.153712412 podStartE2EDuration="25.634733303s" podCreationTimestamp="2025-12-03 14:25:09 +0000 UTC" firstStartedPulling="2025-12-03 14:25:23.722004043 +0000 UTC m=+1136.470974279" lastFinishedPulling="2025-12-03 14:25:34.203024934 +0000 UTC m=+1146.951995170" observedRunningTime="2025-12-03 14:25:34.634137876 +0000 UTC m=+1147.383108132" watchObservedRunningTime="2025-12-03 14:25:34.634733303 +0000 UTC m=+1147.383703549" Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.666090 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.433786154 podStartE2EDuration="32.6660694s" podCreationTimestamp="2025-12-03 14:25:02 +0000 UTC" firstStartedPulling="2025-12-03 14:25:20.853689872 +0000 UTC m=+1133.602660108" lastFinishedPulling="2025-12-03 14:25:26.085973118 +0000 UTC m=+1138.834943354" observedRunningTime="2025-12-03 14:25:34.656930158 +0000 UTC m=+1147.405900534" watchObservedRunningTime="2025-12-03 
14:25:34.6660694 +0000 UTC m=+1147.415039646" Dec 03 14:25:34 crc kubenswrapper[5004]: I1203 14:25:34.962224 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.034225 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.034305 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.283918 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.326811 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.350423 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.392244 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.596848 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.597263 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.633953 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.644719 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.912208 5004 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.950978 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-x4sdg"] Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.952001 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.961483 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 14:25:35 crc kubenswrapper[5004]: I1203 14:25:35.994715 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.003842 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.006902 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.007291 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.014820 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-x4sdg"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.040611 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovs-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.040791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-config\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.040817 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovn-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.040911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-combined-ca-bundle\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.040964 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmddv\" (UniqueName: \"kubernetes.io/projected/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-kube-api-access-bmddv\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.041083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.071023 5004 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.072376 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.076407 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.076704 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.076823 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gns8f" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.076941 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.084352 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.098187 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.137733 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.139475 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.141999 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142271 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142356 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-combined-ca-bundle\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142403 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142429 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmddv\" (UniqueName: \"kubernetes.io/projected/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-kube-api-access-bmddv\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142501 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142534 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142572 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-scripts\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142602 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovs-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142624 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142655 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-config\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142694 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntp5p\" (UniqueName: \"kubernetes.io/projected/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-kube-api-access-ntp5p\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142717 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcmcg\" (UniqueName: \"kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142742 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142779 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-config\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142804 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142827 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.142871 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovn-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.143152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovs-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.143201 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-ovn-rundir\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.144146 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-config\") pod \"ovn-controller-metrics-x4sdg\" (UID: 
\"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.150936 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.162253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-combined-ca-bundle\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.167773 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.171623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmddv\" (UniqueName: \"kubernetes.io/projected/01b52e65-ccad-48ac-91d0-b5b9fb3905cd-kube-api-access-bmddv\") pod \"ovn-controller-metrics-x4sdg\" (UID: \"01b52e65-ccad-48ac-91d0-b5b9fb3905cd\") " pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.246116 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247117 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntp5p\" 
(UniqueName: \"kubernetes.io/projected/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-kube-api-access-ntp5p\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcmcg\" (UniqueName: \"kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247160 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247204 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247220 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: 
\"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247404 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247446 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247483 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247553 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: 
\"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247574 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-scripts\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247594 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cgj\" (UniqueName: \"kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247610 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.247631 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-config\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.248978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-config\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.249124 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.249740 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.249804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.249874 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-scripts\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.256477 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.258426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.265571 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.276674 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntp5p\" (UniqueName: \"kubernetes.io/projected/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-kube-api-access-ntp5p\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.276745 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcmcg\" (UniqueName: \"kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg\") pod \"dnsmasq-dns-7f896c8c65-jhlpj\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.278562 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9af36c08-ab5c-4a97-88d3-a7ef2f032faf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9af36c08-ab5c-4a97-88d3-a7ef2f032faf\") " pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.280484 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-x4sdg" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.333970 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.349174 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.349272 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.349335 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.349369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cgj\" (UniqueName: \"kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.349410 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" 
Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.350401 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.350530 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.350874 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.351771 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.377660 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.378280 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cgj\" (UniqueName: \"kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj\") pod \"dnsmasq-dns-86db49b7ff-bwgl6\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.391653 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.450659 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvrfd\" (UniqueName: \"kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd\") pod \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.451028 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc\") pod \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.451134 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config\") pod \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\" (UID: \"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.452533 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" (UID: 
"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.453288 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config" (OuterVolumeSpecName: "config") pod "38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" (UID: "38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.459057 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd" (OuterVolumeSpecName: "kube-api-access-xvrfd") pod "38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" (UID: "38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a"). InnerVolumeSpecName "kube-api-access-xvrfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.540793 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.543343 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.553557 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvrfd\" (UniqueName: \"kubernetes.io/projected/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-kube-api-access-xvrfd\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.553641 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.553651 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.621767 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" event={"ID":"38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a","Type":"ContainerDied","Data":"719259b9781b7c53304d9b921f84d11733a77b9e2420476d8ab1c01ef6f637f1"} Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.622211 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-72mlm" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.628683 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.629043 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dd59j" event={"ID":"c3c0fb8a-87cc-4b30-894f-a6d6b180a636","Type":"ContainerDied","Data":"dc90469645a82aea019c0671edada83091de7c1a32deec206255f3bc072babd3"} Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.654979 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config\") pod \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.655051 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k7w4\" (UniqueName: \"kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4\") pod \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.655109 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc\") pod \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\" (UID: \"c3c0fb8a-87cc-4b30-894f-a6d6b180a636\") " Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.655690 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config" (OuterVolumeSpecName: "config") pod "c3c0fb8a-87cc-4b30-894f-a6d6b180a636" (UID: "c3c0fb8a-87cc-4b30-894f-a6d6b180a636"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.656277 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3c0fb8a-87cc-4b30-894f-a6d6b180a636" (UID: "c3c0fb8a-87cc-4b30-894f-a6d6b180a636"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.668811 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4" (OuterVolumeSpecName: "kube-api-access-9k7w4") pod "c3c0fb8a-87cc-4b30-894f-a6d6b180a636" (UID: "c3c0fb8a-87cc-4b30-894f-a6d6b180a636"). InnerVolumeSpecName "kube-api-access-9k7w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.727848 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.745724 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-72mlm"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.767044 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k7w4\" (UniqueName: \"kubernetes.io/projected/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-kube-api-access-9k7w4\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.767087 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.767101 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c3c0fb8a-87cc-4b30-894f-a6d6b180a636-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.783390 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.798409 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-x4sdg"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.905602 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.918080 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.925828 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.928508 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.961590 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.979832 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.980074 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.980165 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.980220 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:36 crc kubenswrapper[5004]: I1203 14:25:36.980311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:36 
crc kubenswrapper[5004]: I1203 14:25:36.980424 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm44z\" (UniqueName: \"kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.000973 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.032996 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.047288 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dd59j"] Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.082469 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.082555 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.082597 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: 
\"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.082641 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm44z\" (UniqueName: \"kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.082716 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.083655 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.084385 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.084791 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc 
kubenswrapper[5004]: I1203 14:25:37.085393 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.103003 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm44z\" (UniqueName: \"kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z\") pod \"dnsmasq-dns-698758b865-rzqx7\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.270495 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.308363 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.413382 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.622795 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a" path="/var/lib/kubelet/pods/38ddfa3a-a5ef-40f7-8920-ba29e6ca5b1a/volumes" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.623351 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c0fb8a-87cc-4b30-894f-a6d6b180a636" path="/var/lib/kubelet/pods/c3c0fb8a-87cc-4b30-894f-a6d6b180a636/volumes" Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.636897 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" 
event={"ID":"c01d10f0-2257-4eb5-a5dc-fce48e63103c","Type":"ContainerStarted","Data":"1cf8dfb45301a92791cd72104205d42d28f1fa800f22b4fab108708f1f12f21b"} Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.638648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" event={"ID":"c4a2fd57-f7c1-41bf-871f-6733b6a5f967","Type":"ContainerStarted","Data":"9072c3e01541dcd8899740e8e4be440e3b13d8fbef5458f136a2bf0eb2aa6e18"} Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.639947 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9af36c08-ab5c-4a97-88d3-a7ef2f032faf","Type":"ContainerStarted","Data":"5a0e0870d8e6f4e28dbe5d0219af332ae4e9157263a19fda3f06faf6941fca40"} Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.641837 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-x4sdg" event={"ID":"01b52e65-ccad-48ac-91d0-b5b9fb3905cd","Type":"ContainerStarted","Data":"7f7aa98251078605a5b2044844d84d291106f4a44e324382a50d9944b4cdc965"} Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.641912 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-x4sdg" event={"ID":"01b52e65-ccad-48ac-91d0-b5b9fb3905cd","Type":"ContainerStarted","Data":"bb5e713533e9792843191e405a71f9fb256602936709708288d9ce8581e2e5ab"} Dec 03 14:25:37 crc kubenswrapper[5004]: I1203 14:25:37.724170 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-x4sdg" podStartSLOduration=2.724148443 podStartE2EDuration="2.724148443s" podCreationTimestamp="2025-12-03 14:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:25:37.718332457 +0000 UTC m=+1150.467302693" watchObservedRunningTime="2025-12-03 14:25:37.724148443 +0000 UTC m=+1150.473118729" Dec 03 14:25:37 crc 
kubenswrapper[5004]: I1203 14:25:37.782410 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.081711 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.090375 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.092521 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qb5bc" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.092669 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.092826 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.093267 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.105345 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.207778 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.208040 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-lock\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " 
pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.208125 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-cache\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.208148 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfpk\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-kube-api-access-4zfpk\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.208181 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.309783 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-cache\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.309836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfpk\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-kube-api-access-4zfpk\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.309878 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.309905 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.309936 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-lock\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.310445 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-cache\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.310482 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b45d92a5-2abb-421d-826f-185ac63f4661-lock\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: E1203 14:25:38.310581 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:38 crc kubenswrapper[5004]: E1203 14:25:38.310594 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:38 crc 
kubenswrapper[5004]: E1203 14:25:38.310633 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:25:38.810616812 +0000 UTC m=+1151.559587048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.310824 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.353149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfpk\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-kube-api-access-4zfpk\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.405134 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.615421 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7xgl7"] Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.616368 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.618914 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.619075 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.622157 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.628442 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7xgl7"] Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.650890 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rzqx7" event={"ID":"e604b4a5-cf48-4060-b7f2-556bca7840d3","Type":"ContainerStarted","Data":"fe3e5b36e0ce02efaffbb0dea40947422a194f94e8b0914e9c4fbd3514ec68f9"} Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717037 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717124 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717392 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717432 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717451 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.717489 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rp2b\" (UniqueName: \"kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.818885 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.818926 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.818953 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rp2b\" (UniqueName: \"kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.818973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819026 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819046 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819062 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819128 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: E1203 14:25:38.819218 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:38 crc kubenswrapper[5004]: E1203 14:25:38.819253 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:38 crc kubenswrapper[5004]: E1203 14:25:38.819307 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:25:39.819288124 +0000 UTC m=+1152.568258360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819712 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819966 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.819978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.824243 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.824326 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf\") pod 
\"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.826022 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.839819 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rp2b\" (UniqueName: \"kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b\") pod \"swift-ring-rebalance-7xgl7\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:38 crc kubenswrapper[5004]: I1203 14:25:38.935097 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:39 crc kubenswrapper[5004]: I1203 14:25:39.398738 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7xgl7"] Dec 03 14:25:39 crc kubenswrapper[5004]: W1203 14:25:39.398843 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9137320_4b52_422f_a96b_34c555c55aa6.slice/crio-5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0 WatchSource:0}: Error finding container 5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0: Status 404 returned error can't find the container with id 5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0 Dec 03 14:25:39 crc kubenswrapper[5004]: I1203 14:25:39.659083 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7xgl7" event={"ID":"f9137320-4b52-422f-a96b-34c555c55aa6","Type":"ContainerStarted","Data":"5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0"} Dec 03 14:25:39 crc kubenswrapper[5004]: I1203 14:25:39.834105 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:39 crc kubenswrapper[5004]: E1203 14:25:39.834346 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:39 crc kubenswrapper[5004]: E1203 14:25:39.834361 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:39 crc kubenswrapper[5004]: E1203 14:25:39.834405 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:25:41.834390353 +0000 UTC m=+1154.583360589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:41 crc kubenswrapper[5004]: I1203 14:25:41.869020 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:41 crc kubenswrapper[5004]: E1203 14:25:41.869563 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:41 crc kubenswrapper[5004]: E1203 14:25:41.869579 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:41 crc kubenswrapper[5004]: E1203 14:25:41.869622 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:25:45.869608655 +0000 UTC m=+1158.618578891 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:43 crc kubenswrapper[5004]: I1203 14:25:43.550153 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 14:25:43 crc kubenswrapper[5004]: I1203 14:25:43.550501 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 14:25:44 crc kubenswrapper[5004]: I1203 14:25:44.910169 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.001622 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.344437 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2228-account-create-update-zr2sq"] Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.349168 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.351199 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2228-account-create-update-zr2sq"] Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.352946 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.439065 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrds\" (UniqueName: \"kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds\") pod \"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.439302 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts\") pod \"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.540543 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts\") pod \"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.540649 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrds\" (UniqueName: \"kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds\") pod 
\"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.541446 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts\") pod \"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.564615 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrds\" (UniqueName: \"kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds\") pod \"glance-2228-account-create-update-zr2sq\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.690047 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.714978 5004 generic.go:334] "Generic (PLEG): container finished" podID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerID="d608c980c134d9a189e869f1028e1e66ff452f3c9ff41a542698436fb170e3db" exitCode=0 Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.715288 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rzqx7" event={"ID":"e604b4a5-cf48-4060-b7f2-556bca7840d3","Type":"ContainerDied","Data":"d608c980c134d9a189e869f1028e1e66ff452f3c9ff41a542698436fb170e3db"} Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.723373 5004 generic.go:334] "Generic (PLEG): container finished" podID="c01d10f0-2257-4eb5-a5dc-fce48e63103c" containerID="7df9c988abd777b5297c6ab2b30ccf458b94875541accccc62e6eb5b2a9edbc0" exitCode=0 Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.723436 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" event={"ID":"c01d10f0-2257-4eb5-a5dc-fce48e63103c","Type":"ContainerDied","Data":"7df9c988abd777b5297c6ab2b30ccf458b94875541accccc62e6eb5b2a9edbc0"} Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.727083 5004 generic.go:334] "Generic (PLEG): container finished" podID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerID="465d86b794a1263aacd13fbd7d927b3cd1c4a7437478104ba4ef80557d1b03b0" exitCode=0 Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.727931 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" event={"ID":"c4a2fd57-f7c1-41bf-871f-6733b6a5f967","Type":"ContainerDied","Data":"465d86b794a1263aacd13fbd7d927b3cd1c4a7437478104ba4ef80557d1b03b0"} Dec 03 14:25:45 crc kubenswrapper[5004]: I1203 14:25:45.948838 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:45 crc kubenswrapper[5004]: E1203 14:25:45.948897 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:45 crc kubenswrapper[5004]: E1203 14:25:45.948916 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:45 crc kubenswrapper[5004]: E1203 14:25:45.948971 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:25:53.948951274 +0000 UTC m=+1166.697921510 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:46 crc kubenswrapper[5004]: I1203 14:25:46.274838 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2228-account-create-update-zr2sq"] Dec 03 14:25:46 crc kubenswrapper[5004]: I1203 14:25:46.741093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9af36c08-ab5c-4a97-88d3-a7ef2f032faf","Type":"ContainerStarted","Data":"1d48154cb706144ce00f67ad1d8d0ca0a48c18c4ad7b17c9a20bbe4e4709b113"} Dec 03 14:25:46 crc kubenswrapper[5004]: I1203 14:25:46.743465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rzqx7" event={"ID":"e604b4a5-cf48-4060-b7f2-556bca7840d3","Type":"ContainerStarted","Data":"2ca4bb0cdf21fa6794a7b1bd8a34aec197c49285cc29973851b12e761801543a"} Dec 03 14:25:46 
crc kubenswrapper[5004]: I1203 14:25:46.743683 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:46 crc kubenswrapper[5004]: I1203 14:25:46.769068 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podStartSLOduration=3.747579558 podStartE2EDuration="10.769044901s" podCreationTimestamp="2025-12-03 14:25:36 +0000 UTC" firstStartedPulling="2025-12-03 14:25:37.785071437 +0000 UTC m=+1150.534041673" lastFinishedPulling="2025-12-03 14:25:44.80653678 +0000 UTC m=+1157.555507016" observedRunningTime="2025-12-03 14:25:46.764884402 +0000 UTC m=+1159.513854648" watchObservedRunningTime="2025-12-03 14:25:46.769044901 +0000 UTC m=+1159.518015137" Dec 03 14:25:47 crc kubenswrapper[5004]: E1203 14:25:47.900842 5004 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 03 14:25:47 crc kubenswrapper[5004]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c4a2fd57-f7c1-41bf-871f-6733b6a5f967/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 14:25:47 crc kubenswrapper[5004]: > podSandboxID="9072c3e01541dcd8899740e8e4be440e3b13d8fbef5458f136a2bf0eb2aa6e18" Dec 03 14:25:47 crc kubenswrapper[5004]: E1203 14:25:47.901309 5004 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 14:25:47 crc kubenswrapper[5004]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2cgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-bwgl6_openstack(c4a2fd57-f7c1-41bf-871f-6733b6a5f967): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c4a2fd57-f7c1-41bf-871f-6733b6a5f967/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 14:25:47 crc kubenswrapper[5004]: > logger="UnhandledError" Dec 03 14:25:47 crc kubenswrapper[5004]: E1203 14:25:47.902478 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c4a2fd57-f7c1-41bf-871f-6733b6a5f967/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.195734 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.286099 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config\") pod \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.286254 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc\") pod \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.286306 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcmcg\" (UniqueName: \"kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg\") pod \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.286384 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb\") pod \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\" (UID: \"c01d10f0-2257-4eb5-a5dc-fce48e63103c\") " Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.290577 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg" (OuterVolumeSpecName: "kube-api-access-bcmcg") pod "c01d10f0-2257-4eb5-a5dc-fce48e63103c" (UID: "c01d10f0-2257-4eb5-a5dc-fce48e63103c"). InnerVolumeSpecName "kube-api-access-bcmcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.311770 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config" (OuterVolumeSpecName: "config") pod "c01d10f0-2257-4eb5-a5dc-fce48e63103c" (UID: "c01d10f0-2257-4eb5-a5dc-fce48e63103c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.317350 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c01d10f0-2257-4eb5-a5dc-fce48e63103c" (UID: "c01d10f0-2257-4eb5-a5dc-fce48e63103c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.318557 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c01d10f0-2257-4eb5-a5dc-fce48e63103c" (UID: "c01d10f0-2257-4eb5-a5dc-fce48e63103c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.388074 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.388108 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.388119 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcmcg\" (UniqueName: \"kubernetes.io/projected/c01d10f0-2257-4eb5-a5dc-fce48e63103c-kube-api-access-bcmcg\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.388128 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c01d10f0-2257-4eb5-a5dc-fce48e63103c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.761938 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9af36c08-ab5c-4a97-88d3-a7ef2f032faf","Type":"ContainerStarted","Data":"5f1387f85ae06f814939df6abc2a3f10c736b62803838e6080d220af62ab533e"} Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.762788 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.764163 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" event={"ID":"c01d10f0-2257-4eb5-a5dc-fce48e63103c","Type":"ContainerDied","Data":"1cf8dfb45301a92791cd72104205d42d28f1fa800f22b4fab108708f1f12f21b"} Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.764339 5004 scope.go:117] "RemoveContainer" 
containerID="7df9c988abd777b5297c6ab2b30ccf458b94875541accccc62e6eb5b2a9edbc0" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.764191 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-jhlpj" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.766127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7xgl7" event={"ID":"f9137320-4b52-422f-a96b-34c555c55aa6","Type":"ContainerStarted","Data":"11eca4cfaab7f3e1b6d108096524f03599b8d1bba46908709b8716c4b25becce"} Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.769094 5004 generic.go:334] "Generic (PLEG): container finished" podID="d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" containerID="0717f1a84b54ca455ede6730c25f68eccb072f25095deb51eac907caf59c8d66" exitCode=0 Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.769162 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2228-account-create-update-zr2sq" event={"ID":"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4","Type":"ContainerDied","Data":"0717f1a84b54ca455ede6730c25f68eccb072f25095deb51eac907caf59c8d66"} Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.769197 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2228-account-create-update-zr2sq" event={"ID":"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4","Type":"ContainerStarted","Data":"0164209f6213bd9fb8472f76305a50d177281e60fcfb7a6f0f2238519841a057"} Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.793481 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.068312161 podStartE2EDuration="12.793457534s" podCreationTimestamp="2025-12-03 14:25:36 +0000 UTC" firstStartedPulling="2025-12-03 14:25:36.978973902 +0000 UTC m=+1149.727944138" lastFinishedPulling="2025-12-03 14:25:45.704119275 +0000 UTC m=+1158.453089511" observedRunningTime="2025-12-03 14:25:48.789024327 +0000 UTC 
m=+1161.537994593" watchObservedRunningTime="2025-12-03 14:25:48.793457534 +0000 UTC m=+1161.542427790" Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.849023 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.857162 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-jhlpj"] Dec 03 14:25:48 crc kubenswrapper[5004]: I1203 14:25:48.872777 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7xgl7" podStartSLOduration=2.328611892 podStartE2EDuration="10.872761304s" podCreationTimestamp="2025-12-03 14:25:38 +0000 UTC" firstStartedPulling="2025-12-03 14:25:39.401947414 +0000 UTC m=+1152.150917650" lastFinishedPulling="2025-12-03 14:25:47.946096826 +0000 UTC m=+1160.695067062" observedRunningTime="2025-12-03 14:25:48.859508995 +0000 UTC m=+1161.608479231" watchObservedRunningTime="2025-12-03 14:25:48.872761304 +0000 UTC m=+1161.621731540" Dec 03 14:25:49 crc kubenswrapper[5004]: I1203 14:25:49.625884 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01d10f0-2257-4eb5-a5dc-fce48e63103c" path="/var/lib/kubelet/pods/c01d10f0-2257-4eb5-a5dc-fce48e63103c/volumes" Dec 03 14:25:49 crc kubenswrapper[5004]: I1203 14:25:49.782196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" event={"ID":"c4a2fd57-f7c1-41bf-871f-6733b6a5f967","Type":"ContainerStarted","Data":"67f6eb9d3b4b1f8845468378f3a8e0d565afba742da65172f26ef69921c09599"} Dec 03 14:25:49 crc kubenswrapper[5004]: I1203 14:25:49.809463 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" podStartSLOduration=6.040576201 podStartE2EDuration="13.809445359s" podCreationTimestamp="2025-12-03 14:25:36 +0000 UTC" firstStartedPulling="2025-12-03 14:25:37.039836874 +0000 UTC 
m=+1149.788807110" lastFinishedPulling="2025-12-03 14:25:44.808706032 +0000 UTC m=+1157.557676268" observedRunningTime="2025-12-03 14:25:49.800279136 +0000 UTC m=+1162.549249382" watchObservedRunningTime="2025-12-03 14:25:49.809445359 +0000 UTC m=+1162.558415595" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.115235 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.217232 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzrds\" (UniqueName: \"kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds\") pod \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.217335 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts\") pod \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\" (UID: \"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4\") " Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.218214 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" (UID: "d30fd1af-dedb-4b6c-a3fd-5a327ac580e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.223375 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds" (OuterVolumeSpecName: "kube-api-access-lzrds") pod "d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" (UID: "d30fd1af-dedb-4b6c-a3fd-5a327ac580e4"). InnerVolumeSpecName "kube-api-access-lzrds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.320375 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzrds\" (UniqueName: \"kubernetes.io/projected/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-kube-api-access-lzrds\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.320409 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.791041 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2228-account-create-update-zr2sq" event={"ID":"d30fd1af-dedb-4b6c-a3fd-5a327ac580e4","Type":"ContainerDied","Data":"0164209f6213bd9fb8472f76305a50d177281e60fcfb7a6f0f2238519841a057"} Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.791094 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0164209f6213bd9fb8472f76305a50d177281e60fcfb7a6f0f2238519841a057" Dec 03 14:25:50 crc kubenswrapper[5004]: I1203 14:25:50.791151 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2228-account-create-update-zr2sq" Dec 03 14:25:51 crc kubenswrapper[5004]: I1203 14:25:51.541174 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.274133 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.351421 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.351619 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="dnsmasq-dns" containerID="cri-o://67f6eb9d3b4b1f8845468378f3a8e0d565afba742da65172f26ef69921c09599" gracePeriod=10 Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.805933 5004 generic.go:334] "Generic (PLEG): container finished" podID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerID="67f6eb9d3b4b1f8845468378f3a8e0d565afba742da65172f26ef69921c09599" exitCode=0 Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.806014 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" event={"ID":"c4a2fd57-f7c1-41bf-871f-6733b6a5f967","Type":"ContainerDied","Data":"67f6eb9d3b4b1f8845468378f3a8e0d565afba742da65172f26ef69921c09599"} Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.806344 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" event={"ID":"c4a2fd57-f7c1-41bf-871f-6733b6a5f967","Type":"ContainerDied","Data":"9072c3e01541dcd8899740e8e4be440e3b13d8fbef5458f136a2bf0eb2aa6e18"} Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.806360 5004 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9072c3e01541dcd8899740e8e4be440e3b13d8fbef5458f136a2bf0eb2aa6e18" Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.870300 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.966383 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2cgj\" (UniqueName: \"kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj\") pod \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.966441 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config\") pod \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.966483 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc\") pod \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.966527 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb\") pod \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\" (UID: \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.966627 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb\") pod \"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\" (UID: 
\"c4a2fd57-f7c1-41bf-871f-6733b6a5f967\") " Dec 03 14:25:52 crc kubenswrapper[5004]: I1203 14:25:52.974544 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj" (OuterVolumeSpecName: "kube-api-access-p2cgj") pod "c4a2fd57-f7c1-41bf-871f-6733b6a5f967" (UID: "c4a2fd57-f7c1-41bf-871f-6733b6a5f967"). InnerVolumeSpecName "kube-api-access-p2cgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.009695 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config" (OuterVolumeSpecName: "config") pod "c4a2fd57-f7c1-41bf-871f-6733b6a5f967" (UID: "c4a2fd57-f7c1-41bf-871f-6733b6a5f967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.011825 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4a2fd57-f7c1-41bf-871f-6733b6a5f967" (UID: "c4a2fd57-f7c1-41bf-871f-6733b6a5f967"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.022597 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4a2fd57-f7c1-41bf-871f-6733b6a5f967" (UID: "c4a2fd57-f7c1-41bf-871f-6733b6a5f967"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.023587 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4a2fd57-f7c1-41bf-871f-6733b6a5f967" (UID: "c4a2fd57-f7c1-41bf-871f-6733b6a5f967"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.069336 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.069384 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2cgj\" (UniqueName: \"kubernetes.io/projected/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-kube-api-access-p2cgj\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.069400 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.069412 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.069424 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4a2fd57-f7c1-41bf-871f-6733b6a5f967-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.813615 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bwgl6" Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.833242 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.839767 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bwgl6"] Dec 03 14:25:53 crc kubenswrapper[5004]: I1203 14:25:53.983307 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:25:53 crc kubenswrapper[5004]: E1203 14:25:53.983475 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 14:25:53 crc kubenswrapper[5004]: E1203 14:25:53.983507 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 14:25:53 crc kubenswrapper[5004]: E1203 14:25:53.983574 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift podName:b45d92a5-2abb-421d-826f-185ac63f4661 nodeName:}" failed. No retries permitted until 2025-12-03 14:26:09.983555611 +0000 UTC m=+1182.732525847 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift") pod "swift-storage-0" (UID: "b45d92a5-2abb-421d-826f-185ac63f4661") : configmap "swift-ring-files" not found Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.552260 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4wszb"] Dec 03 14:25:54 crc kubenswrapper[5004]: E1203 14:25:54.552669 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="dnsmasq-dns" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.552688 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="dnsmasq-dns" Dec 03 14:25:54 crc kubenswrapper[5004]: E1203 14:25:54.552717 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" containerName="mariadb-account-create-update" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.552724 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" containerName="mariadb-account-create-update" Dec 03 14:25:54 crc kubenswrapper[5004]: E1203 14:25:54.552734 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="init" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.552742 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="init" Dec 03 14:25:54 crc kubenswrapper[5004]: E1203 14:25:54.552755 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01d10f0-2257-4eb5-a5dc-fce48e63103c" containerName="init" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.552761 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01d10f0-2257-4eb5-a5dc-fce48e63103c" containerName="init" Dec 03 14:25:54 crc 
kubenswrapper[5004]: I1203 14:25:54.553181 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" containerName="dnsmasq-dns" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.553210 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" containerName="mariadb-account-create-update" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.553234 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01d10f0-2257-4eb5-a5dc-fce48e63103c" containerName="init" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.553949 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.566516 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4wszb"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.592480 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts\") pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.592616 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htfsx\" (UniqueName: \"kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx\") pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.640233 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4524-account-create-update-hvf8r"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.641597 5004 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.643887 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.653129 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4524-account-create-update-hvf8r"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.694057 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts\") pod \"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.694166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts\") pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.694214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrbb\" (UniqueName: \"kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb\") pod \"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.694284 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htfsx\" (UniqueName: \"kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx\") 
pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.695320 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts\") pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.713445 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htfsx\" (UniqueName: \"kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx\") pod \"keystone-db-create-4wszb\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.796196 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrbb\" (UniqueName: \"kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb\") pod \"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.796322 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts\") pod \"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.797076 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts\") pod 
\"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.815700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrbb\" (UniqueName: \"kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb\") pod \"keystone-4524-account-create-update-hvf8r\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.877200 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.882795 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pdq5g"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.884116 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.910230 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pdq5g"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.969123 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.987438 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2c8b-account-create-update-85hzs"] Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.988783 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.998305 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.999140 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts\") pod \"placement-db-create-pdq5g\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:54 crc kubenswrapper[5004]: I1203 14:25:54.999201 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9g52\" (UniqueName: \"kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52\") pod \"placement-db-create-pdq5g\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.013809 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2c8b-account-create-update-85hzs"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.100832 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.101641 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts\") pod \"placement-db-create-pdq5g\" (UID: 
\"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.101678 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rtm\" (UniqueName: \"kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.101713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9g52\" (UniqueName: \"kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52\") pod \"placement-db-create-pdq5g\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.102399 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts\") pod \"placement-db-create-pdq5g\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.113374 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-chkcv"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.114755 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.120393 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9g52\" (UniqueName: \"kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52\") pod \"placement-db-create-pdq5g\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.122045 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-chkcv"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.203759 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.203802 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.203829 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68p8\" (UniqueName: \"kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.203927 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t8rtm\" (UniqueName: \"kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.204736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.219068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8rtm\" (UniqueName: \"kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm\") pod \"placement-2c8b-account-create-update-85hzs\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.269153 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.305765 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.306263 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68p8\" (UniqueName: \"kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.306702 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.321684 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68p8\" (UniqueName: \"kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8\") pod \"glance-db-create-chkcv\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.361633 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.378325 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4wszb"] Dec 03 14:25:55 crc kubenswrapper[5004]: W1203 14:25:55.383068 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd567a61_b577_4119_8bd1_59039ffc45e8.slice/crio-3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67 WatchSource:0}: Error finding container 3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67: Status 404 returned error can't find the container with id 3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67 Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.437399 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-chkcv" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.489149 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pdq5g"] Dec 03 14:25:55 crc kubenswrapper[5004]: W1203 14:25:55.499759 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d44f905_8c97_4b62_89e4_a3929a8a2042.slice/crio-c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89 WatchSource:0}: Error finding container c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89: Status 404 returned error can't find the container with id c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89 Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.503135 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4524-account-create-update-hvf8r"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.628137 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c4a2fd57-f7c1-41bf-871f-6733b6a5f967" path="/var/lib/kubelet/pods/c4a2fd57-f7c1-41bf-871f-6733b6a5f967/volumes" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.789303 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2c8b-account-create-update-85hzs"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.856008 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9137320-4b52-422f-a96b-34c555c55aa6" containerID="11eca4cfaab7f3e1b6d108096524f03599b8d1bba46908709b8716c4b25becce" exitCode=0 Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.856072 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7xgl7" event={"ID":"f9137320-4b52-422f-a96b-34c555c55aa6","Type":"ContainerDied","Data":"11eca4cfaab7f3e1b6d108096524f03599b8d1bba46908709b8716c4b25becce"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.859915 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pdq5g" event={"ID":"6d44f905-8c97-4b62-89e4-a3929a8a2042","Type":"ContainerStarted","Data":"59e34967bc80036bb597335aba32998347f1ccb617da5da2b6ca195c41f328b6"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.859956 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pdq5g" event={"ID":"6d44f905-8c97-4b62-89e4-a3929a8a2042","Type":"ContainerStarted","Data":"c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.862321 5004 generic.go:334] "Generic (PLEG): container finished" podID="fd567a61-b577-4119-8bd1-59039ffc45e8" containerID="342e97dddeda55b10502a5b49f567b58e80d1673f2cdffad657c2d2b0905228b" exitCode=0 Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.862382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wszb" 
event={"ID":"fd567a61-b577-4119-8bd1-59039ffc45e8","Type":"ContainerDied","Data":"342e97dddeda55b10502a5b49f567b58e80d1673f2cdffad657c2d2b0905228b"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.862406 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wszb" event={"ID":"fd567a61-b577-4119-8bd1-59039ffc45e8","Type":"ContainerStarted","Data":"3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.864853 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4524-account-create-update-hvf8r" event={"ID":"c9bf012d-617e-44d4-a5c3-6101921a5ece","Type":"ContainerStarted","Data":"b947b7d25ec0c5ca1d7a5ddac28aeaded751ac4c3d58a6ebc3eea09fc47c6b92"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.865008 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4524-account-create-update-hvf8r" event={"ID":"c9bf012d-617e-44d4-a5c3-6101921a5ece","Type":"ContainerStarted","Data":"117af6344d2e73da52a84958f3d788e34b3884b3350d185d8f7ee4d7f5e1c13a"} Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.894724 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-pdq5g" podStartSLOduration=1.894704526 podStartE2EDuration="1.894704526s" podCreationTimestamp="2025-12-03 14:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:25:55.889716773 +0000 UTC m=+1168.638686999" watchObservedRunningTime="2025-12-03 14:25:55.894704526 +0000 UTC m=+1168.643674772" Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.913805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-chkcv"] Dec 03 14:25:55 crc kubenswrapper[5004]: I1203 14:25:55.914432 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-4524-account-create-update-hvf8r" podStartSLOduration=1.9144087490000001 podStartE2EDuration="1.914408749s" podCreationTimestamp="2025-12-03 14:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:25:55.905634338 +0000 UTC m=+1168.654604584" watchObservedRunningTime="2025-12-03 14:25:55.914408749 +0000 UTC m=+1168.663378985" Dec 03 14:25:55 crc kubenswrapper[5004]: W1203 14:25:55.954250 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c5a19d_1c2f_42a4_b6d0_683c02a7af8b.slice/crio-efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430 WatchSource:0}: Error finding container efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430: Status 404 returned error can't find the container with id efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.445235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.873626 5004 generic.go:334] "Generic (PLEG): container finished" podID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerID="36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9" exitCode=0 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.873978 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerDied","Data":"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.876823 5004 generic.go:334] "Generic (PLEG): container finished" podID="c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" containerID="3ccaa28ad38a276cedacc6bddedd2c59654a79661c31faa4bfe4e328a8dac899" exitCode=0 Dec 03 14:25:56 crc 
kubenswrapper[5004]: I1203 14:25:56.876911 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2c8b-account-create-update-85hzs" event={"ID":"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee","Type":"ContainerDied","Data":"3ccaa28ad38a276cedacc6bddedd2c59654a79661c31faa4bfe4e328a8dac899"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.876942 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2c8b-account-create-update-85hzs" event={"ID":"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee","Type":"ContainerStarted","Data":"929c0ca525c9abc00223d6b90f3908629eed961265ae95d7e79d8bad0842487f"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.880349 5004 generic.go:334] "Generic (PLEG): container finished" podID="c9bf012d-617e-44d4-a5c3-6101921a5ece" containerID="b947b7d25ec0c5ca1d7a5ddac28aeaded751ac4c3d58a6ebc3eea09fc47c6b92" exitCode=0 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.880514 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4524-account-create-update-hvf8r" event={"ID":"c9bf012d-617e-44d4-a5c3-6101921a5ece","Type":"ContainerDied","Data":"b947b7d25ec0c5ca1d7a5ddac28aeaded751ac4c3d58a6ebc3eea09fc47c6b92"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.883645 5004 generic.go:334] "Generic (PLEG): container finished" podID="6d44f905-8c97-4b62-89e4-a3929a8a2042" containerID="59e34967bc80036bb597335aba32998347f1ccb617da5da2b6ca195c41f328b6" exitCode=0 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.883729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pdq5g" event={"ID":"6d44f905-8c97-4b62-89e4-a3929a8a2042","Type":"ContainerDied","Data":"59e34967bc80036bb597335aba32998347f1ccb617da5da2b6ca195c41f328b6"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.886092 5004 generic.go:334] "Generic (PLEG): container finished" podID="e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" 
containerID="b84ec5ab5f7aa76f2683c06ad97380104713eaa7ec6767fd2db29e2aa4c86697" exitCode=0 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.886159 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-chkcv" event={"ID":"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b","Type":"ContainerDied","Data":"b84ec5ab5f7aa76f2683c06ad97380104713eaa7ec6767fd2db29e2aa4c86697"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.886181 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-chkcv" event={"ID":"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b","Type":"ContainerStarted","Data":"efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430"} Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.888054 5004 generic.go:334] "Generic (PLEG): container finished" podID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerID="885deb67571b0f40c6fdcdd93d6440c32639996ce8f2cef0da52a94aa94e93d5" exitCode=0 Dec 03 14:25:56 crc kubenswrapper[5004]: I1203 14:25:56.888238 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerDied","Data":"885deb67571b0f40c6fdcdd93d6440c32639996ce8f2cef0da52a94aa94e93d5"} Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.239838 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.283221 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.343763 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.343905 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfsx\" (UniqueName: \"kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx\") pod \"fd567a61-b577-4119-8bd1-59039ffc45e8\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.343961 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rp2b\" (UniqueName: \"kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344313 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts\") pod \"fd567a61-b577-4119-8bd1-59039ffc45e8\" (UID: \"fd567a61-b577-4119-8bd1-59039ffc45e8\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344394 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344456 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344584 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344616 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344717 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf\") pod \"f9137320-4b52-422f-a96b-34c555c55aa6\" (UID: \"f9137320-4b52-422f-a96b-34c555c55aa6\") " Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344761 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd567a61-b577-4119-8bd1-59039ffc45e8" (UID: "fd567a61-b577-4119-8bd1-59039ffc45e8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.344779 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.345412 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.345583 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd567a61-b577-4119-8bd1-59039ffc45e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.345601 5004 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.345610 5004 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f9137320-4b52-422f-a96b-34c555c55aa6-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.348750 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx" (OuterVolumeSpecName: 
"kube-api-access-htfsx") pod "fd567a61-b577-4119-8bd1-59039ffc45e8" (UID: "fd567a61-b577-4119-8bd1-59039ffc45e8"). InnerVolumeSpecName "kube-api-access-htfsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.349330 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b" (OuterVolumeSpecName: "kube-api-access-4rp2b") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "kube-api-access-4rp2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.366064 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.379098 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.380796 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.385992 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts" (OuterVolumeSpecName: "scripts") pod "f9137320-4b52-422f-a96b-34c555c55aa6" (UID: "f9137320-4b52-422f-a96b-34c555c55aa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453341 5004 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453388 5004 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453401 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9137320-4b52-422f-a96b-34c555c55aa6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453415 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfsx\" (UniqueName: \"kubernetes.io/projected/fd567a61-b577-4119-8bd1-59039ffc45e8-kube-api-access-htfsx\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453431 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rp2b\" (UniqueName: \"kubernetes.io/projected/f9137320-4b52-422f-a96b-34c555c55aa6-kube-api-access-4rp2b\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.453442 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9137320-4b52-422f-a96b-34c555c55aa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.909899 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7xgl7" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.909901 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7xgl7" event={"ID":"f9137320-4b52-422f-a96b-34c555c55aa6","Type":"ContainerDied","Data":"5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0"} Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.910307 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0c20a870c882a104985aa6a62a1c448e40a2cce0706e009aa2dfebef868ce0" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.912567 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerStarted","Data":"ce016eb24268eeb5262d178cd7816a4e0120c4f2932fcdc39308002822a6a623"} Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.912757 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.914479 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerStarted","Data":"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba"} Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.914951 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.922776 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4wszb" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.923441 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4wszb" event={"ID":"fd567a61-b577-4119-8bd1-59039ffc45e8","Type":"ContainerDied","Data":"3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67"} Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.923479 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb18ad81e417362102e983b22f3a2e23faa53057a13387709e08d65bf08ee67" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.978080 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371978.876715 podStartE2EDuration="57.978061536s" podCreationTimestamp="2025-12-03 14:25:00 +0000 UTC" firstStartedPulling="2025-12-03 14:25:02.390887129 +0000 UTC m=+1115.139857365" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:25:57.937757304 +0000 UTC m=+1170.686727550" watchObservedRunningTime="2025-12-03 14:25:57.978061536 +0000 UTC m=+1170.727031762" Dec 03 14:25:57 crc kubenswrapper[5004]: I1203 14:25:57.991718 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.542804363 podStartE2EDuration="57.991692546s" podCreationTimestamp="2025-12-03 14:25:00 +0000 UTC" firstStartedPulling="2025-12-03 14:25:10.413493792 +0000 UTC m=+1123.162464028" lastFinishedPulling="2025-12-03 14:25:22.862381975 +0000 UTC m=+1135.611352211" observedRunningTime="2025-12-03 14:25:57.975164624 +0000 UTC m=+1170.724134880" watchObservedRunningTime="2025-12-03 14:25:57.991692546 +0000 UTC m=+1170.740662782" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.360409 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-chkcv" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.467965 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts\") pod \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.468136 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68p8\" (UniqueName: \"kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8\") pod \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\" (UID: \"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.468673 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" (UID: "e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.471670 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8" (OuterVolumeSpecName: "kube-api-access-d68p8") pod "e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" (UID: "e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b"). InnerVolumeSpecName "kube-api-access-d68p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.478500 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.485248 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.491674 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.569889 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrbb\" (UniqueName: \"kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb\") pod \"c9bf012d-617e-44d4-a5c3-6101921a5ece\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.569954 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9g52\" (UniqueName: \"kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52\") pod \"6d44f905-8c97-4b62-89e4-a3929a8a2042\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.569998 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8rtm\" (UniqueName: \"kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm\") pod \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570028 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts\") pod \"c9bf012d-617e-44d4-a5c3-6101921a5ece\" (UID: \"c9bf012d-617e-44d4-a5c3-6101921a5ece\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570091 
5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts\") pod \"6d44f905-8c97-4b62-89e4-a3929a8a2042\" (UID: \"6d44f905-8c97-4b62-89e4-a3929a8a2042\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570526 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68p8\" (UniqueName: \"kubernetes.io/projected/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-kube-api-access-d68p8\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570552 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570663 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d44f905-8c97-4b62-89e4-a3929a8a2042" (UID: "6d44f905-8c97-4b62-89e4-a3929a8a2042"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.570739 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9bf012d-617e-44d4-a5c3-6101921a5ece" (UID: "c9bf012d-617e-44d4-a5c3-6101921a5ece"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.573650 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm" (OuterVolumeSpecName: "kube-api-access-t8rtm") pod "c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" (UID: "c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee"). InnerVolumeSpecName "kube-api-access-t8rtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.574257 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52" (OuterVolumeSpecName: "kube-api-access-t9g52") pod "6d44f905-8c97-4b62-89e4-a3929a8a2042" (UID: "6d44f905-8c97-4b62-89e4-a3929a8a2042"). InnerVolumeSpecName "kube-api-access-t9g52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.574415 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb" (OuterVolumeSpecName: "kube-api-access-jkrbb") pod "c9bf012d-617e-44d4-a5c3-6101921a5ece" (UID: "c9bf012d-617e-44d4-a5c3-6101921a5ece"). InnerVolumeSpecName "kube-api-access-jkrbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.671778 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts\") pod \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\" (UID: \"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee\") " Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.672508 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkrbb\" (UniqueName: \"kubernetes.io/projected/c9bf012d-617e-44d4-a5c3-6101921a5ece-kube-api-access-jkrbb\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.672541 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9g52\" (UniqueName: \"kubernetes.io/projected/6d44f905-8c97-4b62-89e4-a3929a8a2042-kube-api-access-t9g52\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.672558 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8rtm\" (UniqueName: \"kubernetes.io/projected/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-kube-api-access-t8rtm\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.672576 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf012d-617e-44d4-a5c3-6101921a5ece-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.672596 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d44f905-8c97-4b62-89e4-a3929a8a2042-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.673255 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" (UID: "c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.774312 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.933517 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2c8b-account-create-update-85hzs" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.933536 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2c8b-account-create-update-85hzs" event={"ID":"c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee","Type":"ContainerDied","Data":"929c0ca525c9abc00223d6b90f3908629eed961265ae95d7e79d8bad0842487f"} Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.933641 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929c0ca525c9abc00223d6b90f3908629eed961265ae95d7e79d8bad0842487f" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.936246 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4524-account-create-update-hvf8r" event={"ID":"c9bf012d-617e-44d4-a5c3-6101921a5ece","Type":"ContainerDied","Data":"117af6344d2e73da52a84958f3d788e34b3884b3350d185d8f7ee4d7f5e1c13a"} Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.936298 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="117af6344d2e73da52a84958f3d788e34b3884b3350d185d8f7ee4d7f5e1c13a" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.936341 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4524-account-create-update-hvf8r" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.938041 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pdq5g" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.938036 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pdq5g" event={"ID":"6d44f905-8c97-4b62-89e4-a3929a8a2042","Type":"ContainerDied","Data":"c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89"} Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.938154 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4eabe385807292f335a3c23896a00c8ef76661887de8f2ba6adbea364dc6d89" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.939503 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-chkcv" event={"ID":"e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b","Type":"ContainerDied","Data":"efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430"} Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.939534 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efbc0fa3f603eed43807b86171e7655cbf4c290ea249f9326d25fb1105514430" Dec 03 14:25:58 crc kubenswrapper[5004]: I1203 14:25:58.939654 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-chkcv" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.388365 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lnxkp"] Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389051 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389067 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389086 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d44f905-8c97-4b62-89e4-a3929a8a2042" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389093 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d44f905-8c97-4b62-89e4-a3929a8a2042" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389109 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd567a61-b577-4119-8bd1-59039ffc45e8" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389118 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd567a61-b577-4119-8bd1-59039ffc45e8" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389135 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9137320-4b52-422f-a96b-34c555c55aa6" containerName="swift-ring-rebalance" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389143 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9137320-4b52-422f-a96b-34c555c55aa6" containerName="swift-ring-rebalance" Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389159 5004 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c9bf012d-617e-44d4-a5c3-6101921a5ece" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389166 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bf012d-617e-44d4-a5c3-6101921a5ece" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: E1203 14:26:00.389182 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389189 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389371 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389388 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9137320-4b52-422f-a96b-34c555c55aa6" containerName="swift-ring-rebalance" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389400 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bf012d-617e-44d4-a5c3-6101921a5ece" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389410 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" containerName="mariadb-account-create-update" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389423 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd567a61-b577-4119-8bd1-59039ffc45e8" containerName="mariadb-database-create" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.389439 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d44f905-8c97-4b62-89e4-a3929a8a2042" containerName="mariadb-database-create" 
Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.390103 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.392479 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jfgn9" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.392739 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.400798 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lnxkp"] Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.438018 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zdz2r" podUID="9cf66a90-3f7d-4170-8dab-9ff58ba576a3" containerName="ovn-controller" probeResult="failure" output=< Dec 03 14:26:00 crc kubenswrapper[5004]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 14:26:00 crc kubenswrapper[5004]: > Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.504032 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.504200 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.504239 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rmt\" (UniqueName: \"kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.504271 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.508068 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.605272 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.605350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rmt\" (UniqueName: \"kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.605396 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " 
pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.605474 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.620663 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.620692 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.621071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.627301 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rmt\" (UniqueName: \"kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt\") pod \"glance-db-sync-lnxkp\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:00 crc kubenswrapper[5004]: I1203 14:26:00.707790 5004 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:01 crc kubenswrapper[5004]: I1203 14:26:01.240848 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lnxkp"] Dec 03 14:26:01 crc kubenswrapper[5004]: W1203 14:26:01.243946 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bed8f2_08f1_41f0_beb0_d0a2ded315bf.slice/crio-82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462 WatchSource:0}: Error finding container 82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462: Status 404 returned error can't find the container with id 82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462 Dec 03 14:26:01 crc kubenswrapper[5004]: I1203 14:26:01.971246 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lnxkp" event={"ID":"76bed8f2-08f1-41f0-beb0-d0a2ded315bf","Type":"ContainerStarted","Data":"82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462"} Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.439676 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zdz2r" podUID="9cf66a90-3f7d-4170-8dab-9ff58ba576a3" containerName="ovn-controller" probeResult="failure" output=< Dec 03 14:26:05 crc kubenswrapper[5004]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 14:26:05 crc kubenswrapper[5004]: > Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.467388 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-65kf4" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.739973 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zdz2r-config-dcd66"] Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.741031 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.747476 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.750306 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zdz2r-config-dcd66"] Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.935681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.935848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6qm6\" (UniqueName: \"kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.935898 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.935918 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: 
\"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.935957 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:05 crc kubenswrapper[5004]: I1203 14:26:05.936014 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037478 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037527 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037603 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" 
(UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037672 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6qm6\" (UniqueName: \"kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.037708 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.038048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.038086 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: 
\"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.038098 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.038966 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.047412 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.065936 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6qm6\" (UniqueName: \"kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6\") pod \"ovn-controller-zdz2r-config-dcd66\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:06 crc kubenswrapper[5004]: I1203 14:26:06.084712 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:10 crc kubenswrapper[5004]: I1203 14:26:10.012618 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:26:10 crc kubenswrapper[5004]: I1203 14:26:10.023008 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b45d92a5-2abb-421d-826f-185ac63f4661-etc-swift\") pod \"swift-storage-0\" (UID: \"b45d92a5-2abb-421d-826f-185ac63f4661\") " pod="openstack/swift-storage-0" Dec 03 14:26:10 crc kubenswrapper[5004]: I1203 14:26:10.248682 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 14:26:10 crc kubenswrapper[5004]: I1203 14:26:10.438312 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zdz2r" podUID="9cf66a90-3f7d-4170-8dab-9ff58ba576a3" containerName="ovn-controller" probeResult="failure" output=< Dec 03 14:26:10 crc kubenswrapper[5004]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 14:26:10 crc kubenswrapper[5004]: > Dec 03 14:26:11 crc kubenswrapper[5004]: I1203 14:26:11.670498 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.005516 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.165082 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zdz2r-config-dcd66"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.188220 5004 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-db-create-fgpcz"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.190561 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.205656 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-fgpcz"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.314908 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-v49l9"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.315967 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.331070 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e791-account-create-update-bgrw9"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.334928 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.345687 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.358302 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzmv\" (UniqueName: \"kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.358350 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.377307 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e791-account-create-update-bgrw9"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.398104 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v49l9"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.437744 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.460324 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 
14:26:12.463363 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzmv\" (UniqueName: \"kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.463428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.463477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.463515 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kztc\" (UniqueName: \"kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.463558 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " 
pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.464433 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.481423 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzmv\" (UniqueName: \"kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv\") pod \"cinder-db-create-fgpcz\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.481486 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e16c-account-create-update-7d2pm"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.482589 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.493496 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.508436 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e16c-account-create-update-7d2pm"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.545581 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579503 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579633 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kztc\" (UniqueName: \"kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579658 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579817 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4tl7\" (UniqueName: 
\"kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.579881 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.581168 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.582265 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.582331 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-gq5tr"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.588706 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gq5tr"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.588823 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.597073 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.597149 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4rh8q" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.597316 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.597371 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.599839 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2dh4h"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.607771 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.612197 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr\") pod \"barbican-e791-account-create-update-bgrw9\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.619418 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2dh4h"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.636923 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0e9a-account-create-update-z2skj"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.638162 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.640418 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.644250 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kztc\" (UniqueName: \"kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc\") pod \"barbican-db-create-v49l9\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.653735 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e9a-account-create-update-z2skj"] Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.663760 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681815 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4tl7\" (UniqueName: \"kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681888 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681920 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681943 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw8f9\" (UniqueName: \"kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httcg\" (UniqueName: \"kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg\") pod \"neutron-db-create-2dh4h\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.681995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.682046 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts\") pod \"neutron-db-create-2dh4h\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.683883 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.689184 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.701410 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4tl7\" (UniqueName: \"kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7\") pod \"cinder-e16c-account-create-update-7d2pm\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783557 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw8f9\" (UniqueName: \"kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783597 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg9xf\" (UniqueName: \"kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783652 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-httcg\" (UniqueName: \"kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg\") pod 
\"neutron-db-create-2dh4h\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783829 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts\") pod \"neutron-db-create-2dh4h\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783873 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.783993 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.785250 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts\") pod \"neutron-db-create-2dh4h\" (UID: 
\"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.806380 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.806452 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-httcg\" (UniqueName: \"kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg\") pod \"neutron-db-create-2dh4h\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.808321 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.808929 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.811613 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw8f9\" (UniqueName: \"kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9\") pod \"keystone-db-sync-gq5tr\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.887367 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg9xf\" (UniqueName: 
\"kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.887764 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.888751 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:12 crc kubenswrapper[5004]: I1203 14:26:12.907282 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg9xf\" (UniqueName: \"kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf\") pod \"neutron-0e9a-account-create-update-z2skj\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.063325 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"216625c8b25b0c757d1add1bde2492a6be3d6085f10d8c1e11633c427769e301"} Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.068346 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lnxkp" 
event={"ID":"76bed8f2-08f1-41f0-beb0-d0a2ded315bf","Type":"ContainerStarted","Data":"40ef99f63c347aa5849e1574a9221ee9d757da0db9ea62343531ea6338240815"} Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.071286 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r-config-dcd66" event={"ID":"ca432af8-d59b-41ec-94e0-cbd73190ba4b","Type":"ContainerStarted","Data":"1904921655fa97ba0791e953363ab31d01b4a4bb4ed587fb06b0780a8b86782b"} Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.071455 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r-config-dcd66" event={"ID":"ca432af8-d59b-41ec-94e0-cbd73190ba4b","Type":"ContainerStarted","Data":"39ff90dae9bf841274d64097a01dba2101d460d1329fb5e1f1195f1ba2c9cca8"} Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.095956 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.101649 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lnxkp" podStartSLOduration=2.361401569 podStartE2EDuration="13.101630497s" podCreationTimestamp="2025-12-03 14:26:00 +0000 UTC" firstStartedPulling="2025-12-03 14:26:01.250089209 +0000 UTC m=+1173.999059445" lastFinishedPulling="2025-12-03 14:26:11.990318137 +0000 UTC m=+1184.739288373" observedRunningTime="2025-12-03 14:26:13.096531181 +0000 UTC m=+1185.845501417" watchObservedRunningTime="2025-12-03 14:26:13.101630497 +0000 UTC m=+1185.850600733" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.105545 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-fgpcz"] Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.105941 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.123206 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zdz2r-config-dcd66" podStartSLOduration=8.123187113 podStartE2EDuration="8.123187113s" podCreationTimestamp="2025-12-03 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:13.120077335 +0000 UTC m=+1185.869047561" watchObservedRunningTime="2025-12-03 14:26:13.123187113 +0000 UTC m=+1185.872157349" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.128110 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.291674 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e791-account-create-update-bgrw9"] Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.358584 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v49l9"] Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.408682 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e16c-account-create-update-7d2pm"] Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.746970 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2dh4h"] Dec 03 14:26:13 crc kubenswrapper[5004]: W1203 14:26:13.765982 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22aca07_9901_4532_8e36_e4ef14be0a26.slice/crio-a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a WatchSource:0}: Error finding container a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a: Status 404 returned error can't find the container with id 
a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.830948 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gq5tr"] Dec 03 14:26:13 crc kubenswrapper[5004]: W1203 14:26:13.841646 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa5e91c_186c_419f_b6a3_95c486ff267d.slice/crio-44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1 WatchSource:0}: Error finding container 44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1: Status 404 returned error can't find the container with id 44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1 Dec 03 14:26:13 crc kubenswrapper[5004]: I1203 14:26:13.870186 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e9a-account-create-update-z2skj"] Dec 03 14:26:13 crc kubenswrapper[5004]: W1203 14:26:13.895958 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de6f3fa_11a3_4730_8424_47207b77ca2d.slice/crio-7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8 WatchSource:0}: Error finding container 7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8: Status 404 returned error can't find the container with id 7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.081314 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2dh4h" event={"ID":"f22aca07-9901-4532-8e36-e4ef14be0a26","Type":"ContainerStarted","Data":"1e78886f451ae9e3e253ae331d091b6ff038e999da4076a143dfea51c324dc76"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.081532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2dh4h" 
event={"ID":"f22aca07-9901-4532-8e36-e4ef14be0a26","Type":"ContainerStarted","Data":"a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.082720 5004 generic.go:334] "Generic (PLEG): container finished" podID="811a96c5-1501-47c0-a372-702a55e5182f" containerID="360e87564925cc1ed866f6a965f4c597ad769bf2dc578c168aae01d9c3c8406b" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.082792 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v49l9" event={"ID":"811a96c5-1501-47c0-a372-702a55e5182f","Type":"ContainerDied","Data":"360e87564925cc1ed866f6a965f4c597ad769bf2dc578c168aae01d9c3c8406b"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.082915 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v49l9" event={"ID":"811a96c5-1501-47c0-a372-702a55e5182f","Type":"ContainerStarted","Data":"987fdfb7a9d28a4f8c4331833c4106bcc753b4bd441bd4b9265eb006ac1be24e"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.084050 5004 generic.go:334] "Generic (PLEG): container finished" podID="24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" containerID="18f383dbc9f667005300b9f4adfb832d43f150cfc1d8c5b5e7f6775b0632d249" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.084107 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e791-account-create-update-bgrw9" event={"ID":"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d","Type":"ContainerDied","Data":"18f383dbc9f667005300b9f4adfb832d43f150cfc1d8c5b5e7f6775b0632d249"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.084128 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e791-account-create-update-bgrw9" event={"ID":"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d","Type":"ContainerStarted","Data":"b98e0005ad7aa3f64c035501c4c4d48a421df318631534b2a8c0ff7ae85a9394"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.085923 
5004 generic.go:334] "Generic (PLEG): container finished" podID="ca432af8-d59b-41ec-94e0-cbd73190ba4b" containerID="1904921655fa97ba0791e953363ab31d01b4a4bb4ed587fb06b0780a8b86782b" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.085977 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r-config-dcd66" event={"ID":"ca432af8-d59b-41ec-94e0-cbd73190ba4b","Type":"ContainerDied","Data":"1904921655fa97ba0791e953363ab31d01b4a4bb4ed587fb06b0780a8b86782b"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.087209 5004 generic.go:334] "Generic (PLEG): container finished" podID="8590fe31-fb29-40a9-b61e-569709bf9008" containerID="a582e484dbb2a3817dbe16f7da8e11ec2bb058999140ca903ce779db3b968118" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.087258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e16c-account-create-update-7d2pm" event={"ID":"8590fe31-fb29-40a9-b61e-569709bf9008","Type":"ContainerDied","Data":"a582e484dbb2a3817dbe16f7da8e11ec2bb058999140ca903ce779db3b968118"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.087277 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e16c-account-create-update-7d2pm" event={"ID":"8590fe31-fb29-40a9-b61e-569709bf9008","Type":"ContainerStarted","Data":"178d59685450b0ce9c5dc13f817cbe249c643bf89c40c2bb58e8af7b99ded830"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.088694 5004 generic.go:334] "Generic (PLEG): container finished" podID="08a45345-f85d-4134-87d4-70377be8f7cf" containerID="37d1103f002984f67aaf6772b4a0a58a8d130bfc087a7955ec35122714950641" exitCode=0 Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.088744 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fgpcz" event={"ID":"08a45345-f85d-4134-87d4-70377be8f7cf","Type":"ContainerDied","Data":"37d1103f002984f67aaf6772b4a0a58a8d130bfc087a7955ec35122714950641"} Dec 03 14:26:14 crc 
kubenswrapper[5004]: I1203 14:26:14.088765 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fgpcz" event={"ID":"08a45345-f85d-4134-87d4-70377be8f7cf","Type":"ContainerStarted","Data":"317d746e52be5525c8f2930c913016bea31d2c27991ef87b84686d2764b80c5e"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.089935 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9a-account-create-update-z2skj" event={"ID":"8de6f3fa-11a3-4730-8424-47207b77ca2d","Type":"ContainerStarted","Data":"7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.092209 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gq5tr" event={"ID":"7aa5e91c-186c-419f-b6a3-95c486ff267d","Type":"ContainerStarted","Data":"44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1"} Dec 03 14:26:14 crc kubenswrapper[5004]: I1203 14:26:14.105510 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2dh4h" podStartSLOduration=2.105491586 podStartE2EDuration="2.105491586s" podCreationTimestamp="2025-12-03 14:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:14.105363333 +0000 UTC m=+1186.854333569" watchObservedRunningTime="2025-12-03 14:26:14.105491586 +0000 UTC m=+1186.854461822" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.102100 5004 generic.go:334] "Generic (PLEG): container finished" podID="8de6f3fa-11a3-4730-8424-47207b77ca2d" containerID="e1ec5dc0002d8c342388f403b03d9bfbf915d1efa0335bddc020897fa20bf841" exitCode=0 Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.102290 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9a-account-create-update-z2skj" 
event={"ID":"8de6f3fa-11a3-4730-8424-47207b77ca2d","Type":"ContainerDied","Data":"e1ec5dc0002d8c342388f403b03d9bfbf915d1efa0335bddc020897fa20bf841"} Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.106482 5004 generic.go:334] "Generic (PLEG): container finished" podID="f22aca07-9901-4532-8e36-e4ef14be0a26" containerID="1e78886f451ae9e3e253ae331d091b6ff038e999da4076a143dfea51c324dc76" exitCode=0 Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.106601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2dh4h" event={"ID":"f22aca07-9901-4532-8e36-e4ef14be0a26","Type":"ContainerDied","Data":"1e78886f451ae9e3e253ae331d091b6ff038e999da4076a143dfea51c324dc76"} Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.108751 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"fa9f707f125eb48654a34cc9a3d955565ef7b5b1895d68c30149749385502cf9"} Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.108793 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"3dcbace965ec80c8ad11fc8bc87f0e28990ce142432d3fed20077a4af1354cd5"} Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.425880 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.457469 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zdz2r" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.559904 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts\") pod \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.560331 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr\") pod \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\" (UID: \"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.561119 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" (UID: "24f6fb8e-6887-4ca5-8fc6-3c44db29d84d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.569389 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr" (OuterVolumeSpecName: "kube-api-access-xkrvr") pod "24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" (UID: "24f6fb8e-6887-4ca5-8fc6-3c44db29d84d"). InnerVolumeSpecName "kube-api-access-xkrvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.663659 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.663690 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d-kube-api-access-xkrvr\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.746226 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.758404 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.777532 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kztc\" (UniqueName: \"kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc\") pod \"811a96c5-1501-47c0-a372-702a55e5182f\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.782982 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.808073 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc" (OuterVolumeSpecName: "kube-api-access-8kztc") pod "811a96c5-1501-47c0-a372-702a55e5182f" (UID: "811a96c5-1501-47c0-a372-702a55e5182f"). InnerVolumeSpecName "kube-api-access-8kztc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.810985 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.881234 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts\") pod \"811a96c5-1501-47c0-a372-702a55e5182f\" (UID: \"811a96c5-1501-47c0-a372-702a55e5182f\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.881810 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.881840 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882016 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882009 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882046 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts\") pod \"08a45345-f85d-4134-87d4-70377be8f7cf\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882078 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run" (OuterVolumeSpecName: "var-run") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882092 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts\") pod \"8590fe31-fb29-40a9-b61e-569709bf9008\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882146 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882173 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4tl7\" (UniqueName: \"kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7\") pod \"8590fe31-fb29-40a9-b61e-569709bf9008\" (UID: \"8590fe31-fb29-40a9-b61e-569709bf9008\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 
14:26:15.882277 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882464 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08a45345-f85d-4134-87d4-70377be8f7cf" (UID: "08a45345-f85d-4134-87d4-70377be8f7cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882606 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8590fe31-fb29-40a9-b61e-569709bf9008" (UID: "8590fe31-fb29-40a9-b61e-569709bf9008"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882286 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6qm6\" (UniqueName: \"kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882791 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "811a96c5-1501-47c0-a372-702a55e5182f" (UID: "811a96c5-1501-47c0-a372-702a55e5182f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.882842 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qzmv\" (UniqueName: \"kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv\") pod \"08a45345-f85d-4134-87d4-70377be8f7cf\" (UID: \"08a45345-f85d-4134-87d4-70377be8f7cf\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883169 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts\") pod \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\" (UID: \"ca432af8-d59b-41ec-94e0-cbd73190ba4b\") " Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883680 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts" (OuterVolumeSpecName: "scripts") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883706 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/811a96c5-1501-47c0-a372-702a55e5182f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883728 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kztc\" (UniqueName: \"kubernetes.io/projected/811a96c5-1501-47c0-a372-702a55e5182f-kube-api-access-8kztc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883743 5004 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883754 5004 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883766 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a45345-f85d-4134-87d4-70377be8f7cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883778 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8590fe31-fb29-40a9-b61e-569709bf9008-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.883791 5004 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca432af8-d59b-41ec-94e0-cbd73190ba4b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.884107 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.885393 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7" (OuterVolumeSpecName: "kube-api-access-p4tl7") pod "8590fe31-fb29-40a9-b61e-569709bf9008" (UID: "8590fe31-fb29-40a9-b61e-569709bf9008"). InnerVolumeSpecName "kube-api-access-p4tl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.886108 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6" (OuterVolumeSpecName: "kube-api-access-v6qm6") pod "ca432af8-d59b-41ec-94e0-cbd73190ba4b" (UID: "ca432af8-d59b-41ec-94e0-cbd73190ba4b"). InnerVolumeSpecName "kube-api-access-v6qm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.886510 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv" (OuterVolumeSpecName: "kube-api-access-9qzmv") pod "08a45345-f85d-4134-87d4-70377be8f7cf" (UID: "08a45345-f85d-4134-87d4-70377be8f7cf"). InnerVolumeSpecName "kube-api-access-9qzmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.988447 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4tl7\" (UniqueName: \"kubernetes.io/projected/8590fe31-fb29-40a9-b61e-569709bf9008-kube-api-access-p4tl7\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.988530 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6qm6\" (UniqueName: \"kubernetes.io/projected/ca432af8-d59b-41ec-94e0-cbd73190ba4b-kube-api-access-v6qm6\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.988543 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qzmv\" (UniqueName: \"kubernetes.io/projected/08a45345-f85d-4134-87d4-70377be8f7cf-kube-api-access-9qzmv\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.988556 5004 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:15 crc kubenswrapper[5004]: I1203 14:26:15.988572 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca432af8-d59b-41ec-94e0-cbd73190ba4b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.123502 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zdz2r-config-dcd66" event={"ID":"ca432af8-d59b-41ec-94e0-cbd73190ba4b","Type":"ContainerDied","Data":"39ff90dae9bf841274d64097a01dba2101d460d1329fb5e1f1195f1ba2c9cca8"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.123544 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ff90dae9bf841274d64097a01dba2101d460d1329fb5e1f1195f1ba2c9cca8" Dec 03 14:26:16 crc 
kubenswrapper[5004]: I1203 14:26:16.123600 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zdz2r-config-dcd66" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.126744 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e16c-account-create-update-7d2pm" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.126836 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e16c-account-create-update-7d2pm" event={"ID":"8590fe31-fb29-40a9-b61e-569709bf9008","Type":"ContainerDied","Data":"178d59685450b0ce9c5dc13f817cbe249c643bf89c40c2bb58e8af7b99ded830"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.126999 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="178d59685450b0ce9c5dc13f817cbe249c643bf89c40c2bb58e8af7b99ded830" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.130922 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"0bd97ef924cd258f2ce17aebf67ee6c9ac4638b1ced1c106e9911d275791c13a"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.131015 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"948f20e114c92a220a31909c63855471bdb57234436836e27fd32350ecb204ed"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.133732 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fgpcz" event={"ID":"08a45345-f85d-4134-87d4-70377be8f7cf","Type":"ContainerDied","Data":"317d746e52be5525c8f2930c913016bea31d2c27991ef87b84686d2764b80c5e"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.133767 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-fgpcz" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.133781 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="317d746e52be5525c8f2930c913016bea31d2c27991ef87b84686d2764b80c5e" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.136755 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v49l9" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.136761 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v49l9" event={"ID":"811a96c5-1501-47c0-a372-702a55e5182f","Type":"ContainerDied","Data":"987fdfb7a9d28a4f8c4331833c4106bcc753b4bd441bd4b9265eb006ac1be24e"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.136887 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987fdfb7a9d28a4f8c4331833c4106bcc753b4bd441bd4b9265eb006ac1be24e" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.139168 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e791-account-create-update-bgrw9" event={"ID":"24f6fb8e-6887-4ca5-8fc6-3c44db29d84d","Type":"ContainerDied","Data":"b98e0005ad7aa3f64c035501c4c4d48a421df318631534b2a8c0ff7ae85a9394"} Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.139222 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b98e0005ad7aa3f64c035501c4c4d48a421df318631534b2a8c0ff7ae85a9394" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.139376 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e791-account-create-update-bgrw9" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.318997 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zdz2r-config-dcd66"] Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.329830 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zdz2r-config-dcd66"] Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.792065 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.798783 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.819566 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts\") pod \"8de6f3fa-11a3-4730-8424-47207b77ca2d\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.819721 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg9xf\" (UniqueName: \"kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf\") pod \"8de6f3fa-11a3-4730-8424-47207b77ca2d\" (UID: \"8de6f3fa-11a3-4730-8424-47207b77ca2d\") " Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.820777 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8de6f3fa-11a3-4730-8424-47207b77ca2d" (UID: "8de6f3fa-11a3-4730-8424-47207b77ca2d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.830096 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf" (OuterVolumeSpecName: "kube-api-access-tg9xf") pod "8de6f3fa-11a3-4730-8424-47207b77ca2d" (UID: "8de6f3fa-11a3-4730-8424-47207b77ca2d"). InnerVolumeSpecName "kube-api-access-tg9xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.921324 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-httcg\" (UniqueName: \"kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg\") pod \"f22aca07-9901-4532-8e36-e4ef14be0a26\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.921550 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts\") pod \"f22aca07-9901-4532-8e36-e4ef14be0a26\" (UID: \"f22aca07-9901-4532-8e36-e4ef14be0a26\") " Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.922221 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8de6f3fa-11a3-4730-8424-47207b77ca2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.922249 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg9xf\" (UniqueName: \"kubernetes.io/projected/8de6f3fa-11a3-4730-8424-47207b77ca2d-kube-api-access-tg9xf\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.922244 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f22aca07-9901-4532-8e36-e4ef14be0a26" (UID: "f22aca07-9901-4532-8e36-e4ef14be0a26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:16 crc kubenswrapper[5004]: I1203 14:26:16.925207 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg" (OuterVolumeSpecName: "kube-api-access-httcg") pod "f22aca07-9901-4532-8e36-e4ef14be0a26" (UID: "f22aca07-9901-4532-8e36-e4ef14be0a26"). InnerVolumeSpecName "kube-api-access-httcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.024096 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22aca07-9901-4532-8e36-e4ef14be0a26-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.024125 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-httcg\" (UniqueName: \"kubernetes.io/projected/f22aca07-9901-4532-8e36-e4ef14be0a26-kube-api-access-httcg\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.151915 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e9a-account-create-update-z2skj" event={"ID":"8de6f3fa-11a3-4730-8424-47207b77ca2d","Type":"ContainerDied","Data":"7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8"} Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.151937 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e9a-account-create-update-z2skj" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.151955 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7822de19b90d3acf9e1630e62de16838b6bb4681c395cc5e76e33df35654dfc8" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.156426 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2dh4h" event={"ID":"f22aca07-9901-4532-8e36-e4ef14be0a26","Type":"ContainerDied","Data":"a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a"} Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.156477 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a35e1f7a35efcd762d377ad668971f52608176399c24234283aaf4b7a73a5e0a" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.156451 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2dh4h" Dec 03 14:26:17 crc kubenswrapper[5004]: I1203 14:26:17.637820 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca432af8-d59b-41ec-94e0-cbd73190ba4b" path="/var/lib/kubelet/pods/ca432af8-d59b-41ec-94e0-cbd73190ba4b/volumes" Dec 03 14:26:20 crc kubenswrapper[5004]: I1203 14:26:20.184009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"b4fcdd3514a01711c4ee8be404f1f1c105aa4ffd748c5f360ad70b2a79cd69d4"} Dec 03 14:26:20 crc kubenswrapper[5004]: I1203 14:26:20.184716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"7586e83c75346868e3604276e447694fc439eb07ab9e8680ce7d7d2042792ecb"} Dec 03 14:26:20 crc kubenswrapper[5004]: I1203 14:26:20.186513 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-gq5tr" event={"ID":"7aa5e91c-186c-419f-b6a3-95c486ff267d","Type":"ContainerStarted","Data":"db43822ea38c33ddc82bfbab2e0e49d58b62465f107c9874c0574f913c623746"} Dec 03 14:26:20 crc kubenswrapper[5004]: I1203 14:26:20.214416 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-gq5tr" podStartSLOduration=2.784538858 podStartE2EDuration="8.214400061s" podCreationTimestamp="2025-12-03 14:26:12 +0000 UTC" firstStartedPulling="2025-12-03 14:26:13.844821314 +0000 UTC m=+1186.593791550" lastFinishedPulling="2025-12-03 14:26:19.274682507 +0000 UTC m=+1192.023652753" observedRunningTime="2025-12-03 14:26:20.207214276 +0000 UTC m=+1192.956184522" watchObservedRunningTime="2025-12-03 14:26:20.214400061 +0000 UTC m=+1192.963370287" Dec 03 14:26:21 crc kubenswrapper[5004]: I1203 14:26:21.196350 5004 generic.go:334] "Generic (PLEG): container finished" podID="76bed8f2-08f1-41f0-beb0-d0a2ded315bf" containerID="40ef99f63c347aa5849e1574a9221ee9d757da0db9ea62343531ea6338240815" exitCode=0 Dec 03 14:26:21 crc kubenswrapper[5004]: I1203 14:26:21.196447 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lnxkp" event={"ID":"76bed8f2-08f1-41f0-beb0-d0a2ded315bf","Type":"ContainerDied","Data":"40ef99f63c347aa5849e1574a9221ee9d757da0db9ea62343531ea6338240815"} Dec 03 14:26:21 crc kubenswrapper[5004]: I1203 14:26:21.202089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"77610fa77edf366b4683d174b0791d3e451c23a735c5e3a310427313bfbe2be0"} Dec 03 14:26:21 crc kubenswrapper[5004]: I1203 14:26:21.202238 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"6cca84bbd465492fd8c4af7518bdd543c4a5d01acbfb077776130d448da9335a"} Dec 03 14:26:22 crc 
kubenswrapper[5004]: I1203 14:26:22.227281 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"4e41c6aad1272ec49e6a5d0ca802a7180f01cbc54d5f7351cee8397bb062e8ed"} Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.227637 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"ed07928efed8e7e1455fac1998f8b75b35d4c02703e9b35b221367a5e47d445c"} Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.227650 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"c55d76d98b5815d59fa8825c5c8a980e84df7f14d5d2f56d5a49c580515b8fff"} Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.641463 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.760489 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle\") pod \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.760622 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data\") pod \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.760649 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7rmt\" (UniqueName: \"kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt\") pod \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.760685 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data\") pod \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\" (UID: \"76bed8f2-08f1-41f0-beb0-d0a2ded315bf\") " Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.775671 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "76bed8f2-08f1-41f0-beb0-d0a2ded315bf" (UID: "76bed8f2-08f1-41f0-beb0-d0a2ded315bf"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.781138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt" (OuterVolumeSpecName: "kube-api-access-r7rmt") pod "76bed8f2-08f1-41f0-beb0-d0a2ded315bf" (UID: "76bed8f2-08f1-41f0-beb0-d0a2ded315bf"). InnerVolumeSpecName "kube-api-access-r7rmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.793631 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76bed8f2-08f1-41f0-beb0-d0a2ded315bf" (UID: "76bed8f2-08f1-41f0-beb0-d0a2ded315bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.820831 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data" (OuterVolumeSpecName: "config-data") pod "76bed8f2-08f1-41f0-beb0-d0a2ded315bf" (UID: "76bed8f2-08f1-41f0-beb0-d0a2ded315bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.824479 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.824555 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.862701 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.862736 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7rmt\" (UniqueName: \"kubernetes.io/projected/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-kube-api-access-r7rmt\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.862747 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:22 crc kubenswrapper[5004]: I1203 14:26:22.862757 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bed8f2-08f1-41f0-beb0-d0a2ded315bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.250304 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"097de745a379df08445d938ca0f5c7f3522a74971a00bbe4ebcb64d911d280be"} Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.250651 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"dee1025c85cc355da4285abfd3dfddcf354e9105e79f82c190ca69c82825c996"} Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.252472 5004 generic.go:334] "Generic (PLEG): container finished" podID="7aa5e91c-186c-419f-b6a3-95c486ff267d" containerID="db43822ea38c33ddc82bfbab2e0e49d58b62465f107c9874c0574f913c623746" exitCode=0 Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.252529 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gq5tr" event={"ID":"7aa5e91c-186c-419f-b6a3-95c486ff267d","Type":"ContainerDied","Data":"db43822ea38c33ddc82bfbab2e0e49d58b62465f107c9874c0574f913c623746"} Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.254595 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lnxkp" event={"ID":"76bed8f2-08f1-41f0-beb0-d0a2ded315bf","Type":"ContainerDied","Data":"82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462"} Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.254626 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82e2dc2e810eb841228bb29c2184fdafe625db1624fa93c09d5e30f5836ee462" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.254689 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lnxkp" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.450910 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bed8f2_08f1_41f0_beb0_d0a2ded315bf.slice\": RecentStats: unable to find data in memory cache]" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516256 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516637 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22aca07-9901-4532-8e36-e4ef14be0a26" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516659 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22aca07-9901-4532-8e36-e4ef14be0a26" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516675 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811a96c5-1501-47c0-a372-702a55e5182f" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516683 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="811a96c5-1501-47c0-a372-702a55e5182f" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516693 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516701 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516717 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a45345-f85d-4134-87d4-70377be8f7cf" 
containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516725 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a45345-f85d-4134-87d4-70377be8f7cf" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516737 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de6f3fa-11a3-4730-8424-47207b77ca2d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516744 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de6f3fa-11a3-4730-8424-47207b77ca2d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516766 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca432af8-d59b-41ec-94e0-cbd73190ba4b" containerName="ovn-config" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516774 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca432af8-d59b-41ec-94e0-cbd73190ba4b" containerName="ovn-config" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516790 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bed8f2-08f1-41f0-beb0-d0a2ded315bf" containerName="glance-db-sync" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516797 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bed8f2-08f1-41f0-beb0-d0a2ded315bf" containerName="glance-db-sync" Dec 03 14:26:23 crc kubenswrapper[5004]: E1203 14:26:23.516815 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8590fe31-fb29-40a9-b61e-569709bf9008" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.516823 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8590fe31-fb29-40a9-b61e-569709bf9008" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517050 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="811a96c5-1501-47c0-a372-702a55e5182f" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517089 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca432af8-d59b-41ec-94e0-cbd73190ba4b" containerName="ovn-config" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517106 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bed8f2-08f1-41f0-beb0-d0a2ded315bf" containerName="glance-db-sync" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517122 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de6f3fa-11a3-4730-8424-47207b77ca2d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517138 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8590fe31-fb29-40a9-b61e-569709bf9008" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517150 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22aca07-9901-4532-8e36-e4ef14be0a26" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517165 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" containerName="mariadb-account-create-update" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.517181 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a45345-f85d-4134-87d4-70377be8f7cf" containerName="mariadb-database-create" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.520837 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.537599 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.675600 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.675755 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.675894 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflpr\" (UniqueName: \"kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.675945 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.676132 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.778051 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.778340 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.778375 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.778408 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflpr\" (UniqueName: \"kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.778427 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.779068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.779081 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.779606 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.779695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.797703 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflpr\" (UniqueName: \"kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr\") pod \"dnsmasq-dns-5b946c75cc-ft85p\" 
(UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:23 crc kubenswrapper[5004]: I1203 14:26:23.845018 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.273559 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"c68cd7c3bc5b1ae6476e1dbeb1aa471d68b8af500d2b27a1f2735a7fba21f1b3"} Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.273629 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b45d92a5-2abb-421d-826f-185ac63f4661","Type":"ContainerStarted","Data":"218b82c3da840ac158059f8a68740cc6a812934606ba819ce335e1c128031da7"} Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.308390 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.326280 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.172750348 podStartE2EDuration="47.326256394s" podCreationTimestamp="2025-12-03 14:25:37 +0000 UTC" firstStartedPulling="2025-12-03 14:26:12.438990184 +0000 UTC m=+1185.187960420" lastFinishedPulling="2025-12-03 14:26:21.59249621 +0000 UTC m=+1194.341466466" observedRunningTime="2025-12-03 14:26:24.315792435 +0000 UTC m=+1197.064762671" watchObservedRunningTime="2025-12-03 14:26:24.326256394 +0000 UTC m=+1197.075226630" Dec 03 14:26:24 crc kubenswrapper[5004]: W1203 14:26:24.337223 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda095b655_0c15_4eb2_96e2_ee59c4d47ce8.slice/crio-8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee WatchSource:0}: Error 
finding container 8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee: Status 404 returned error can't find the container with id 8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.590068 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.609280 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.643038 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:24 crc kubenswrapper[5004]: E1203 14:26:24.643693 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa5e91c-186c-419f-b6a3-95c486ff267d" containerName="keystone-db-sync" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.643711 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa5e91c-186c-419f-b6a3-95c486ff267d" containerName="keystone-db-sync" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.643983 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa5e91c-186c-419f-b6a3-95c486ff267d" containerName="keystone-db-sync" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.644994 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.647270 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.653705 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697308 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw8f9\" (UniqueName: \"kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9\") pod \"7aa5e91c-186c-419f-b6a3-95c486ff267d\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697389 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data\") pod \"7aa5e91c-186c-419f-b6a3-95c486ff267d\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697447 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle\") pod \"7aa5e91c-186c-419f-b6a3-95c486ff267d\" (UID: \"7aa5e91c-186c-419f-b6a3-95c486ff267d\") " Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697686 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697713 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmvv\" (UniqueName: \"kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697768 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697838 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697907 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.697933 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.709122 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9" (OuterVolumeSpecName: "kube-api-access-nw8f9") pod "7aa5e91c-186c-419f-b6a3-95c486ff267d" (UID: "7aa5e91c-186c-419f-b6a3-95c486ff267d"). InnerVolumeSpecName "kube-api-access-nw8f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.730772 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aa5e91c-186c-419f-b6a3-95c486ff267d" (UID: "7aa5e91c-186c-419f-b6a3-95c486ff267d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.745525 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data" (OuterVolumeSpecName: "config-data") pod "7aa5e91c-186c-419f-b6a3-95c486ff267d" (UID: "7aa5e91c-186c-419f-b6a3-95c486ff267d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.799713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.799998 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmvv\" (UniqueName: \"kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800032 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800071 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800130 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800177 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw8f9\" (UniqueName: \"kubernetes.io/projected/7aa5e91c-186c-419f-b6a3-95c486ff267d-kube-api-access-nw8f9\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800188 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800198 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa5e91c-186c-419f-b6a3-95c486ff267d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.800814 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.801013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.801075 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.801997 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.803086 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:24 crc kubenswrapper[5004]: I1203 14:26:24.820335 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmvv\" (UniqueName: \"kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv\") pod \"dnsmasq-dns-74f6bcbc87-lndlt\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.011624 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.298501 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gq5tr" event={"ID":"7aa5e91c-186c-419f-b6a3-95c486ff267d","Type":"ContainerDied","Data":"44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1"} Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.298808 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c892c90d500c240e7f55084930b0a8a4ef139aa968dac96596a1aef85be2c1" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.298917 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gq5tr" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.319541 5004 generic.go:334] "Generic (PLEG): container finished" podID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" containerID="41cbd00adf83b7dde705d5f17977d78a20fa97a69245fb7841b59cd9079675e0" exitCode=0 Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.319653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" event={"ID":"a095b655-0c15-4eb2-96e2-ee59c4d47ce8","Type":"ContainerDied","Data":"41cbd00adf83b7dde705d5f17977d78a20fa97a69245fb7841b59cd9079675e0"} Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.319710 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" event={"ID":"a095b655-0c15-4eb2-96e2-ee59c4d47ce8","Type":"ContainerStarted","Data":"8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee"} Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.465515 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x52kd"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.466835 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.475605 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.475850 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4rh8q" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.476013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.476146 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.476279 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.512684 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515556 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515601 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515650 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515685 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9pww\" (UniqueName: \"kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515712 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.515777 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.534300 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x52kd"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.576099 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.577876 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.616692 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.616970 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.617089 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whdpc\" (UniqueName: \"kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.617643 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.617747 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: 
\"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.617904 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618052 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618221 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9pww\" (UniqueName: \"kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " 
pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618744 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.618895 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.637785 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.647290 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.647980 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.649515 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.657990 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.658969 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.673539 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.678344 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9pww\" (UniqueName: \"kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww\") pod \"keystone-bootstrap-x52kd\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.706603 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-c9xfn"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.717657 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-c9xfn"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.717784 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.722546 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xt67s" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723069 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723320 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723358 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723427 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723471 
5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whdpc\" (UniqueName: \"kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.723532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.725261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.734164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.736544 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.741009 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.743225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.744040 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.748363 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mj68n" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.748576 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.748744 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.748988 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.750363 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.751231 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.774110 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:26:25 crc 
kubenswrapper[5004]: I1203 14:26:25.784661 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whdpc\" (UniqueName: \"kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc\") pod \"dnsmasq-dns-847c4cc679-gkdfh\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.813618 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.815975 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wv94f"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.818053 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.827669 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.827840 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.827959 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: 
\"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.828041 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.828153 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmts\" (UniqueName: \"kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.828258 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.828354 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.828431 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts\") pod \"horizon-8645b8f4d9-4xc7j\" 
(UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.852448 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kzb2h" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.852634 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.852726 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.860085 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wv94f"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.917609 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.937990 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.939448 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944034 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgqh\" (UniqueName: \"kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944124 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944270 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc 
kubenswrapper[5004]: I1203 14:26:25.944369 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944409 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944430 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944499 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944535 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.944554 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.945056 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.945173 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmts\" (UniqueName: \"kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.945213 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.945447 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.945948 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.948754 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.949418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.952920 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-646bh"] Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.954131 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.967380 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n8ncc" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.967641 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.969527 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:25 crc kubenswrapper[5004]: I1203 14:26:25.992492 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-646bh"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.015134 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmts\" (UniqueName: \"kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts\") pod \"neutron-db-sync-c9xfn\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.018548 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.024669 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f\") pod \"horizon-8645b8f4d9-4xc7j\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046251 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zsgqh\" (UniqueName: \"kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046304 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046353 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749gx\" (UniqueName: \"kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046376 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6xs\" (UniqueName: \"kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046407 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046432 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046478 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046503 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046530 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046552 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046580 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.046616 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.047994 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.066056 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts\") pod \"cinder-db-sync-wv94f\" (UID: 
\"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.066146 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.068891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.087619 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.089618 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.118385 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgqh\" (UniqueName: \"kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.120186 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data\") pod \"cinder-db-sync-wv94f\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.132524 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.146395 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156404 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156456 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156508 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156557 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749gx\" (UniqueName: \"kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156584 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6xs\" (UniqueName: \"kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156653 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156716 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.156746 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.158204 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.158426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.160598 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.161618 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.165563 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.166072 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jfgn9" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.166365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.169108 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.169555 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.177964 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.181899 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.186734 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.194630 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.195718 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.208915 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.221636 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6xs\" (UniqueName: \"kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs\") pod \"barbican-db-sync-646bh\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: E1203 14:26:26.245025 5004 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 03 14:26:26 crc kubenswrapper[5004]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/a095b655-0c15-4eb2-96e2-ee59c4d47ce8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 14:26:26 crc kubenswrapper[5004]: > podSandboxID="8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee" Dec 03 14:26:26 crc kubenswrapper[5004]: E1203 14:26:26.245234 5004 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 14:26:26 crc kubenswrapper[5004]: 
container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c5h88h579h579h64bhch5b8h98h558h5bdh679hbdh559h64bh644h656hb9h5dh654h5ch597h55ch5cdh689h57ch54dh696h655h64h58h55fhcq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mflpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5b946c75cc-ft85p_openstack(a095b655-0c15-4eb2-96e2-ee59c4d47ce8): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/a095b655-0c15-4eb2-96e2-ee59c4d47ce8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 14:26:26 crc kubenswrapper[5004]: > logger="UnhandledError" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.254446 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.256013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749gx\" (UniqueName: \"kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx\") pod \"horizon-554f88d76f-hr4dj\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc 
kubenswrapper[5004]: I1203 14:26:26.258888 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.258923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.258950 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.258973 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgl4x\" (UniqueName: \"kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259005 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259022 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259048 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259088 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259105 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259150 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kvw2\" (UniqueName: \"kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259223 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.259267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.271342 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-646bh" Dec 03 14:26:26 crc kubenswrapper[5004]: E1203 14:26:26.278371 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/a095b655-0c15-4eb2-96e2-ee59c4d47ce8/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" podUID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.351915 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8csk5"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.353059 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.360980 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgl4x\" (UniqueName: \"kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361028 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " 
pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361088 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361121 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361139 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361196 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kvw2\" (UniqueName: \"kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361241 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361307 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361322 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.361395 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.362097 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.364125 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7pj4f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.364493 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.364606 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.365088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.365375 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.369442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.369651 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.379869 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.383252 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.399031 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgl4x\" (UniqueName: \"kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.407403 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.407747 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-8csk5"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.408430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.408688 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.409357 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.411657 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wv94f" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.413805 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.424622 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kvw2\" (UniqueName: \"kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2\") pod \"ceilometer-0\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.452629 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" event={"ID":"1675bd8d-aafb-4c6a-8814-877430a39bec","Type":"ContainerStarted","Data":"15bd2bc86ef709694841ef52f8d18b76d737503fcaee59d38f3bbe13993b9288"} Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.461603 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.462832 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.462939 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4ds\" (UniqueName: \"kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds\") pod \"placement-db-sync-8csk5\" (UID: 
\"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.463013 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.463063 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.463093 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.465236 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.486378 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.489415 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.510471 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.512286 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.515378 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.521726 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.564890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.564941 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.564968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525l9\" (UniqueName: \"kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc 
kubenswrapper[5004]: I1203 14:26:26.565025 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565070 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565119 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565150 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 
14:26:26.565172 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565197 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4ds\" (UniqueName: \"kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565246 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565283 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565327 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565351 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565376 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565426 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd975\" (UniqueName: \"kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565468 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.565499 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.566087 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.588445 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.590744 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4ds\" (UniqueName: \"kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.593710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.594499 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle\") pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.594790 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts\") 
pod \"placement-db-sync-8csk5\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.624652 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.666889 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667264 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667352 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667407 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667486 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667516 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd975\" (UniqueName: \"kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667554 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667585 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667634 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667663 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.667694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-525l9\" (UniqueName: \"kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.668317 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.669411 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.669951 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.670444 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.670942 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.672581 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.672794 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.677866 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.680543 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.682071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.684590 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.686914 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-525l9\" (UniqueName: \"kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.691907 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd975\" (UniqueName: \"kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975\") pod \"dnsmasq-dns-785d8bcb8c-9rkv4\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.695711 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8csk5" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.733228 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.820277 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.847802 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.891043 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.920930 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x52kd"] Dec 03 14:26:26 crc kubenswrapper[5004]: I1203 14:26:26.937466 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.074663 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-c9xfn"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.327600 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.380503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mflpr\" (UniqueName: \"kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr\") pod \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.380562 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb\") pod \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.380597 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc\") pod \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.380645 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config\") pod \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.380686 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb\") pod \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\" (UID: \"a095b655-0c15-4eb2-96e2-ee59c4d47ce8\") " Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.396712 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr" (OuterVolumeSpecName: "kube-api-access-mflpr") pod "a095b655-0c15-4eb2-96e2-ee59c4d47ce8" (UID: "a095b655-0c15-4eb2-96e2-ee59c4d47ce8"). InnerVolumeSpecName "kube-api-access-mflpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.419201 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-646bh"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.453011 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.473625 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c9xfn" event={"ID":"84d38e9d-5aea-4c66-8c17-cc31d9494116","Type":"ContainerStarted","Data":"44395539fd8de7450c7049e999668c5cc194baee6ea5495bf38dc0753002de43"} Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.474610 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.482223 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mflpr\" (UniqueName: 
\"kubernetes.io/projected/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-kube-api-access-mflpr\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.484137 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" event={"ID":"a095b655-0c15-4eb2-96e2-ee59c4d47ce8","Type":"ContainerDied","Data":"8b06307799472e3184a0a365045304124eb0b410acfc707161b4c177704e79ee"} Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.484191 5004 scope.go:117] "RemoveContainer" containerID="41cbd00adf83b7dde705d5f17977d78a20fa97a69245fb7841b59cd9079675e0" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.484245 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-ft85p" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.489564 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" event={"ID":"c9ab7112-45a5-4a8c-8d61-c8592d47b88d","Type":"ContainerStarted","Data":"12e945a440fc5d38b79f3b51861aa608c7aed8b2bce93491d4925a4e657dade2"} Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.491825 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wv94f"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.512594 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x52kd" event={"ID":"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9","Type":"ContainerStarted","Data":"32b34eb532f020647e858657a8a7d35ed8149d1ef4a2babf45736e25719b5930"} Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.517597 5004 generic.go:334] "Generic (PLEG): container finished" podID="1675bd8d-aafb-4c6a-8814-877430a39bec" containerID="47b4158dd40007ff0e256e573db571d18e3ce6e0de6b67b224d6a466fd6eeaa2" exitCode=0 Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.517644 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" event={"ID":"1675bd8d-aafb-4c6a-8814-877430a39bec","Type":"ContainerDied","Data":"47b4158dd40007ff0e256e573db571d18e3ce6e0de6b67b224d6a466fd6eeaa2"} Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.547313 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.562636 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8csk5"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.704553 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a095b655-0c15-4eb2-96e2-ee59c4d47ce8" (UID: "a095b655-0c15-4eb2-96e2-ee59c4d47ce8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.706228 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a095b655-0c15-4eb2-96e2-ee59c4d47ce8" (UID: "a095b655-0c15-4eb2-96e2-ee59c4d47ce8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.715470 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.720831 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config" (OuterVolumeSpecName: "config") pod "a095b655-0c15-4eb2-96e2-ee59c4d47ce8" (UID: "a095b655-0c15-4eb2-96e2-ee59c4d47ce8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:27 crc kubenswrapper[5004]: W1203 14:26:27.725506 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d10fd7_16ee_4670_8b57_b2cf118f7530.slice/crio-5f12717a7c89926a9e039d71bb60f754d3e7f541dc43527457e4ad1e0b1785a8 WatchSource:0}: Error finding container 5f12717a7c89926a9e039d71bb60f754d3e7f541dc43527457e4ad1e0b1785a8: Status 404 returned error can't find the container with id 5f12717a7c89926a9e039d71bb60f754d3e7f541dc43527457e4ad1e0b1785a8 Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.734584 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a095b655-0c15-4eb2-96e2-ee59c4d47ce8" (UID: "a095b655-0c15-4eb2-96e2-ee59c4d47ce8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.799770 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.799816 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.799830 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.799842 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a095b655-0c15-4eb2-96e2-ee59c4d47ce8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:27 crc kubenswrapper[5004]: I1203 14:26:27.895231 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.108343 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.116690 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.116715 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-ft85p"] Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.128738 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216467 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216535 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216604 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: 
\"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216674 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216723 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bmvv\" (UniqueName: \"kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.216822 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config\") pod \"1675bd8d-aafb-4c6a-8814-877430a39bec\" (UID: \"1675bd8d-aafb-4c6a-8814-877430a39bec\") " Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.241044 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.257315 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv" (OuterVolumeSpecName: "kube-api-access-6bmvv") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "kube-api-access-6bmvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.257466 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.258303 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config" (OuterVolumeSpecName: "config") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.275975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.284296 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1675bd8d-aafb-4c6a-8814-877430a39bec" (UID: "1675bd8d-aafb-4c6a-8814-877430a39bec"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.323250 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.325598 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.325828 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.325956 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.326037 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bmvv\" (UniqueName: \"kubernetes.io/projected/1675bd8d-aafb-4c6a-8814-877430a39bec-kube-api-access-6bmvv\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.326125 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1675bd8d-aafb-4c6a-8814-877430a39bec-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.537448 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x52kd" event={"ID":"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9","Type":"ContainerStarted","Data":"1fc0a9e367fdeabd5336b2b4f36abd921fec677a4c3548ee811b27f9b7633f1d"} Dec 03 14:26:28 crc 
kubenswrapper[5004]: I1203 14:26:28.541001 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8csk5" event={"ID":"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6","Type":"ContainerStarted","Data":"eaed4a63d81a75ec22cf77d62ba4c382d24394e714be015e01b342ebf4f75436"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.542618 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8645b8f4d9-4xc7j" event={"ID":"893b17ae-7521-48d8-8285-75b91f9f0936","Type":"ContainerStarted","Data":"932a26b575269cedad8fb33fc2be370f895e8cbfe338a6d7151d99aa45409a5d"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.545626 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerStarted","Data":"41da35e897c9883742c8575add1344efa13b1cd73ddab11a094bd1eb82979ed0"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.547784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" event={"ID":"1675bd8d-aafb-4c6a-8814-877430a39bec","Type":"ContainerDied","Data":"15bd2bc86ef709694841ef52f8d18b76d737503fcaee59d38f3bbe13993b9288"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.547834 5004 scope.go:117] "RemoveContainer" containerID="47b4158dd40007ff0e256e573db571d18e3ce6e0de6b67b224d6a466fd6eeaa2" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.547846 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-lndlt" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.551777 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" event={"ID":"63d10fd7-16ee-4670-8b57-b2cf118f7530","Type":"ContainerStarted","Data":"5f12717a7c89926a9e039d71bb60f754d3e7f541dc43527457e4ad1e0b1785a8"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.556969 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x52kd" podStartSLOduration=3.556951924 podStartE2EDuration="3.556951924s" podCreationTimestamp="2025-12-03 14:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:28.555038869 +0000 UTC m=+1201.304009115" watchObservedRunningTime="2025-12-03 14:26:28.556951924 +0000 UTC m=+1201.305922160" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.563060 5004 generic.go:334] "Generic (PLEG): container finished" podID="c9ab7112-45a5-4a8c-8d61-c8592d47b88d" containerID="bf1b06170aed0ce5b23ad342d4c2bc34877c01a1a68b1090d9237b6aaab8434b" exitCode=0 Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.563116 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" event={"ID":"c9ab7112-45a5-4a8c-8d61-c8592d47b88d","Type":"ContainerDied","Data":"bf1b06170aed0ce5b23ad342d4c2bc34877c01a1a68b1090d9237b6aaab8434b"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.568987 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-646bh" event={"ID":"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52","Type":"ContainerStarted","Data":"4406f065d05c6775ca615bf6c0c332e4274964e4b9d6c1ed60edede50c023bab"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.571294 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554f88d76f-hr4dj" 
event={"ID":"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8","Type":"ContainerStarted","Data":"ee92a19af896152991a9f35f8762e7e00367522d1ffedff2a979e1e90fe0f122"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.573891 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c9xfn" event={"ID":"84d38e9d-5aea-4c66-8c17-cc31d9494116","Type":"ContainerStarted","Data":"7868aa0b8066bfecd1cb2ca83f6c3f89c7491195180e8311bc7539b3977d899e"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.577125 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wv94f" event={"ID":"82059d63-43a0-43ed-b9ea-9c54f700a2dc","Type":"ContainerStarted","Data":"dfe60405dbb7ac61700ff0b03b8f19de13b7a02817cfb1d04e573f0ce65c041a"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.585609 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerStarted","Data":"a7ba8666bbc439e42afbdbca1664998d592526e558e6afdfe65e813a67817932"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.587304 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerStarted","Data":"ab4888e680886af6dcdf3e540fb303504122f430ee43f35d5e29af27f467652e"} Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.600035 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-c9xfn" podStartSLOduration=3.600017165 podStartE2EDuration="3.600017165s" podCreationTimestamp="2025-12-03 14:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:28.596063402 +0000 UTC m=+1201.345033638" watchObservedRunningTime="2025-12-03 14:26:28.600017165 +0000 UTC m=+1201.348987401" Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 
14:26:28.706711 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:28 crc kubenswrapper[5004]: I1203 14:26:28.719233 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-lndlt"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.364125 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.427644 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.441718 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.459607 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:26:29 crc kubenswrapper[5004]: E1203 14:26:29.460187 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1675bd8d-aafb-4c6a-8814-877430a39bec" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.460206 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1675bd8d-aafb-4c6a-8814-877430a39bec" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: E1203 14:26:29.460242 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.460252 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.460423 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.460437 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1675bd8d-aafb-4c6a-8814-877430a39bec" containerName="init" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.461366 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.476231 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.488514 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.548123 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.548527 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.548646 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpsrn\" (UniqueName: \"kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.548789 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.548817 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.634750 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1675bd8d-aafb-4c6a-8814-877430a39bec" path="/var/lib/kubelet/pods/1675bd8d-aafb-4c6a-8814-877430a39bec/volumes" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.635484 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a095b655-0c15-4eb2-96e2-ee59c4d47ce8" path="/var/lib/kubelet/pods/a095b655-0c15-4eb2-96e2-ee59c4d47ce8/volumes" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.650557 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.650620 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.650777 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.650906 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.650988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpsrn\" (UniqueName: \"kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.651719 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.651788 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.652640 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts\") pod \"horizon-868d7445fc-kvh87\" (UID: 
\"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.675164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.684624 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpsrn\" (UniqueName: \"kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn\") pod \"horizon-868d7445fc-kvh87\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:29 crc kubenswrapper[5004]: I1203 14:26:29.783028 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.642422 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" event={"ID":"c9ab7112-45a5-4a8c-8d61-c8592d47b88d","Type":"ContainerDied","Data":"12e945a440fc5d38b79f3b51861aa608c7aed8b2bce93491d4925a4e657dade2"} Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.642804 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e945a440fc5d38b79f3b51861aa608c7aed8b2bce93491d4925a4e657dade2" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.708564 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.789474 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.789521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.789627 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.789678 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.789723 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whdpc\" (UniqueName: \"kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.790432 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0\") pod \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\" (UID: \"c9ab7112-45a5-4a8c-8d61-c8592d47b88d\") " Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.795177 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc" (OuterVolumeSpecName: "kube-api-access-whdpc") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "kube-api-access-whdpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.811910 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.817891 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.819238 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.820694 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.825404 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config" (OuterVolumeSpecName: "config") pod "c9ab7112-45a5-4a8c-8d61-c8592d47b88d" (UID: "c9ab7112-45a5-4a8c-8d61-c8592d47b88d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.892596 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.892653 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.892667 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.892678 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:31 crc kubenswrapper[5004]: 
I1203 14:26:31.892690 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whdpc\" (UniqueName: \"kubernetes.io/projected/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-kube-api-access-whdpc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:31 crc kubenswrapper[5004]: I1203 14:26:31.892705 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c9ab7112-45a5-4a8c-8d61-c8592d47b88d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:32 crc kubenswrapper[5004]: W1203 14:26:32.064789 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24c4c4e3_c00c_470e_ba56_3e50b7f8187f.slice/crio-46bfd0481e53fe00932e62eedcf57af4e669e8a1feee9ceef239e2255efeac0d WatchSource:0}: Error finding container 46bfd0481e53fe00932e62eedcf57af4e669e8a1feee9ceef239e2255efeac0d: Status 404 returned error can't find the container with id 46bfd0481e53fe00932e62eedcf57af4e669e8a1feee9ceef239e2255efeac0d Dec 03 14:26:32 crc kubenswrapper[5004]: I1203 14:26:32.070744 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:26:32 crc kubenswrapper[5004]: I1203 14:26:32.650338 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868d7445fc-kvh87" event={"ID":"24c4c4e3-c00c-470e-ba56-3e50b7f8187f","Type":"ContainerStarted","Data":"46bfd0481e53fe00932e62eedcf57af4e669e8a1feee9ceef239e2255efeac0d"} Dec 03 14:26:32 crc kubenswrapper[5004]: I1203 14:26:32.650383 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gkdfh" Dec 03 14:26:32 crc kubenswrapper[5004]: I1203 14:26:32.747478 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:32 crc kubenswrapper[5004]: I1203 14:26:32.754499 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gkdfh"] Dec 03 14:26:33 crc kubenswrapper[5004]: I1203 14:26:33.650902 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ab7112-45a5-4a8c-8d61-c8592d47b88d" path="/var/lib/kubelet/pods/c9ab7112-45a5-4a8c-8d61-c8592d47b88d/volumes" Dec 03 14:26:33 crc kubenswrapper[5004]: I1203 14:26:33.675061 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerStarted","Data":"fb1a05af2c98bb6795aac23cb691141ae23f7f386536082f83576c7310db61b6"} Dec 03 14:26:33 crc kubenswrapper[5004]: I1203 14:26:33.687838 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerID="8c1a093e281e3599558b4236f645d67964df02fcf656556ed573088d2ea345cc" exitCode=0 Dec 03 14:26:33 crc kubenswrapper[5004]: I1203 14:26:33.687921 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" event={"ID":"63d10fd7-16ee-4670-8b57-b2cf118f7530","Type":"ContainerDied","Data":"8c1a093e281e3599558b4236f645d67964df02fcf656556ed573088d2ea345cc"} Dec 03 14:26:33 crc kubenswrapper[5004]: I1203 14:26:33.729599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerStarted","Data":"3ae4bc4f0debca898c60eb27dad9f3e2b8d86b35b59453f7e8416be6a0472150"} Dec 03 14:26:34 crc kubenswrapper[5004]: I1203 14:26:34.740414 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerStarted","Data":"30f64a868a4232df09656f53cf6bedfcd6f9d4e6d8ed08ee56e379f6b290cd22"} Dec 03 14:26:34 crc kubenswrapper[5004]: I1203 14:26:34.745739 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerStarted","Data":"66f23ad167f111d0d85847cc13469b12a5a184610c48ca541d9f23a59fc74f17"} Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.764414 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-log" containerID="cri-o://fb1a05af2c98bb6795aac23cb691141ae23f7f386536082f83576c7310db61b6" gracePeriod=30 Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.765610 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" event={"ID":"63d10fd7-16ee-4670-8b57-b2cf118f7530","Type":"ContainerStarted","Data":"f108915e52d7dde700f8ca6ec043e4a50c3383cd60c414f632df9f8fab1b30ad"} Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.765649 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.765751 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-log" containerID="cri-o://3ae4bc4f0debca898c60eb27dad9f3e2b8d86b35b59453f7e8416be6a0472150" gracePeriod=30 Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.766231 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-httpd" 
containerID="cri-o://30f64a868a4232df09656f53cf6bedfcd6f9d4e6d8ed08ee56e379f6b290cd22" gracePeriod=30 Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.766477 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-httpd" containerID="cri-o://66f23ad167f111d0d85847cc13469b12a5a184610c48ca541d9f23a59fc74f17" gracePeriod=30 Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.804775 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" podStartSLOduration=9.804753138 podStartE2EDuration="9.804753138s" podCreationTimestamp="2025-12-03 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:35.793421694 +0000 UTC m=+1208.542391940" watchObservedRunningTime="2025-12-03 14:26:35.804753138 +0000 UTC m=+1208.553723384" Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.826093 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.826073498 podStartE2EDuration="9.826073498s" podCreationTimestamp="2025-12-03 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:35.812729466 +0000 UTC m=+1208.561699712" watchObservedRunningTime="2025-12-03 14:26:35.826073498 +0000 UTC m=+1208.575043734" Dec 03 14:26:35 crc kubenswrapper[5004]: I1203 14:26:35.846310 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.846288326 podStartE2EDuration="10.846288326s" podCreationTimestamp="2025-12-03 14:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:26:35.843526817 +0000 UTC m=+1208.592497063" watchObservedRunningTime="2025-12-03 14:26:35.846288326 +0000 UTC m=+1208.595258562" Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.781188 5004 generic.go:334] "Generic (PLEG): container finished" podID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerID="66f23ad167f111d0d85847cc13469b12a5a184610c48ca541d9f23a59fc74f17" exitCode=0 Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.781612 5004 generic.go:334] "Generic (PLEG): container finished" podID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerID="3ae4bc4f0debca898c60eb27dad9f3e2b8d86b35b59453f7e8416be6a0472150" exitCode=143 Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.781532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerDied","Data":"66f23ad167f111d0d85847cc13469b12a5a184610c48ca541d9f23a59fc74f17"} Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.781723 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerDied","Data":"3ae4bc4f0debca898c60eb27dad9f3e2b8d86b35b59453f7e8416be6a0472150"} Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.790078 5004 generic.go:334] "Generic (PLEG): container finished" podID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerID="30f64a868a4232df09656f53cf6bedfcd6f9d4e6d8ed08ee56e379f6b290cd22" exitCode=0 Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.790115 5004 generic.go:334] "Generic (PLEG): container finished" podID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerID="fb1a05af2c98bb6795aac23cb691141ae23f7f386536082f83576c7310db61b6" exitCode=143 Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.790147 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerDied","Data":"30f64a868a4232df09656f53cf6bedfcd6f9d4e6d8ed08ee56e379f6b290cd22"} Dec 03 14:26:36 crc kubenswrapper[5004]: I1203 14:26:36.790211 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerDied","Data":"fb1a05af2c98bb6795aac23cb691141ae23f7f386536082f83576c7310db61b6"} Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.816462 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.831900 5004 generic.go:334] "Generic (PLEG): container finished" podID="b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" containerID="1fc0a9e367fdeabd5336b2b4f36abd921fec677a4c3548ee811b27f9b7633f1d" exitCode=0 Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.832096 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x52kd" event={"ID":"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9","Type":"ContainerDied","Data":"1fc0a9e367fdeabd5336b2b4f36abd921fec677a4c3548ee811b27f9b7633f1d"} Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.847451 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-587bd47d68-c6stc"] Dec 03 14:26:37 crc kubenswrapper[5004]: E1203 14:26:37.847838 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ab7112-45a5-4a8c-8d61-c8592d47b88d" containerName="init" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.847869 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ab7112-45a5-4a8c-8d61-c8592d47b88d" containerName="init" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.848078 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ab7112-45a5-4a8c-8d61-c8592d47b88d" containerName="init" Dec 03 14:26:37 crc kubenswrapper[5004]: 
I1203 14:26:37.848981 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.852767 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.871351 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587bd47d68-c6stc"] Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915130 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915181 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915266 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915340 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22lzn\" (UniqueName: \"kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn\") pod \"horizon-587bd47d68-c6stc\" (UID: 
\"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915762 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.915843 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.924575 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.971185 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79df97d86b-4dr9p"] Dec 03 14:26:37 crc kubenswrapper[5004]: I1203 14:26:37.975046 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.016257 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79df97d86b-4dr9p"] Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.019670 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.019820 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.020015 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.020043 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.020086 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data\") pod 
\"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.020139 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22lzn\" (UniqueName: \"kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.020258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.023544 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.025154 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.025593 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc 
kubenswrapper[5004]: I1203 14:26:38.050338 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.051640 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.052560 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.056149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22lzn\" (UniqueName: \"kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn\") pod \"horizon-587bd47d68-c6stc\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") " pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123136 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559lt\" (UniqueName: \"kubernetes.io/projected/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-kube-api-access-559lt\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123239 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-scripts\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123345 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-tls-certs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123380 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-logs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123413 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-config-data\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123446 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-secret-key\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.123501 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-combined-ca-bundle\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.174530 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-scripts\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-tls-certs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225228 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-logs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225254 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-config-data\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc 
kubenswrapper[5004]: I1203 14:26:38.225275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-secret-key\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225303 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-combined-ca-bundle\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.225337 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559lt\" (UniqueName: \"kubernetes.io/projected/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-kube-api-access-559lt\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.227149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-logs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.227835 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-config-data\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.227900 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-scripts\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.244831 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-combined-ca-bundle\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.244966 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-secret-key\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.248494 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-horizon-tls-certs\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.249162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559lt\" (UniqueName: \"kubernetes.io/projected/e0f1c734-5c6e-4f15-8f11-1e3c1da2d880-kube-api-access-559lt\") pod \"horizon-79df97d86b-4dr9p\" (UID: \"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880\") " pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:38 crc kubenswrapper[5004]: I1203 14:26:38.408888 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:26:41 crc kubenswrapper[5004]: I1203 14:26:41.822675 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:26:41 crc kubenswrapper[5004]: I1203 14:26:41.874222 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:26:41 crc kubenswrapper[5004]: I1203 14:26:41.874493 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" containerID="cri-o://2ca4bb0cdf21fa6794a7b1bd8a34aec197c49285cc29973851b12e761801543a" gracePeriod=10 Dec 03 14:26:42 crc kubenswrapper[5004]: I1203 14:26:42.271697 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 03 14:26:42 crc kubenswrapper[5004]: I1203 14:26:42.912374 5004 generic.go:334] "Generic (PLEG): container finished" podID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerID="2ca4bb0cdf21fa6794a7b1bd8a34aec197c49285cc29973851b12e761801543a" exitCode=0 Dec 03 14:26:42 crc kubenswrapper[5004]: I1203 14:26:42.912424 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rzqx7" event={"ID":"e604b4a5-cf48-4060-b7f2-556bca7840d3","Type":"ContainerDied","Data":"2ca4bb0cdf21fa6794a7b1bd8a34aec197c49285cc29973851b12e761801543a"} Dec 03 14:26:47 crc kubenswrapper[5004]: I1203 14:26:47.272388 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 03 
14:26:49 crc kubenswrapper[5004]: E1203 14:26:49.040256 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 14:26:49 crc kubenswrapper[5004]: E1203 14:26:49.040711 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5f8h568hb9h655hc6h64fh644hfh56bh548h586hbbh65fh56h566h65h85h546h5fdh5b5h5dfh678h689h5dbh97h6ch676h688h55h646h578q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7gz4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*424
00,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8645b8f4d9-4xc7j_openstack(893b17ae-7521-48d8-8285-75b91f9f0936): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:26:49 crc kubenswrapper[5004]: E1203 14:26:49.043432 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8645b8f4d9-4xc7j" podUID="893b17ae-7521-48d8-8285-75b91f9f0936" Dec 03 14:26:49 crc kubenswrapper[5004]: E1203 14:26:49.058918 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 14:26:49 crc kubenswrapper[5004]: E1203 14:26:49.059090 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbfh579hb5hb6h68dh664h595h5f4h55fh6fh559h54chfbh5d8hddh5cfh6ch78h647h96h648h5cdhc6h55fh587h57ch65fh5c8h5dbh649h5ch695q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-749gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-554f88d76f-hr4dj_openstack(119b6d49-de5d-42d6-9bbb-4a8a5403c5b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:26:49 crc kubenswrapper[5004]: E1203 
14:26:49.061461 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-554f88d76f-hr4dj" podUID="119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" Dec 03 14:26:51 crc kubenswrapper[5004]: E1203 14:26:51.063754 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 03 14:26:51 crc kubenswrapper[5004]: E1203 14:26:51.064461 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb4h7ch64h5ch66h56dh699hc8h8dhb4h64fh687h5b8h5bhc5h5f7h644h689h55bh687h599h5c6h66dh644h59h575h68ch64ch677h595hd7hbdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jpsrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-868d7445fc-kvh87_openstack(24c4c4e3-c00c-470e-ba56-3e50b7f8187f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:26:51 crc kubenswrapper[5004]: E1203 
14:26:51.066659 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-868d7445fc-kvh87" podUID="24c4c4e3-c00c-470e-ba56-3e50b7f8187f" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.193901 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.194844 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305484 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305544 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305576 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305600 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305627 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgl4x\" (UniqueName: \"kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305682 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9pww\" (UniqueName: \"kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305718 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305895 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 
14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305922 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305939 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys\") pod \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\" (UID: \"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305955 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.305983 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data\") pod \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\" (UID: \"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8\") " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.310421 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.312265 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs" (OuterVolumeSpecName: "logs") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.318651 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts" (OuterVolumeSpecName: "scripts") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.318657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts" (OuterVolumeSpecName: "scripts") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.318804 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.318810 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww" (OuterVolumeSpecName: "kube-api-access-q9pww") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "kube-api-access-q9pww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.321725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.356415 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.357143 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x" (OuterVolumeSpecName: "kube-api-access-xgl4x") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "kube-api-access-xgl4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.365829 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.372540 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data" (OuterVolumeSpecName: "config-data") pod "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" (UID: "b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.386522 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410106 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410171 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410184 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410195 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410229 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410258 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410271 5004 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410303 5004 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410317 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgl4x\" (UniqueName: \"kubernetes.io/projected/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-kube-api-access-xgl4x\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410333 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9pww\" (UniqueName: \"kubernetes.io/projected/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9-kube-api-access-q9pww\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410345 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.410358 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.426695 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data" (OuterVolumeSpecName: "config-data") pod "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" (UID: "f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.450960 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.511491 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.511512 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.568912 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79df97d86b-4dr9p"] Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.983485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x52kd" event={"ID":"b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9","Type":"ContainerDied","Data":"32b34eb532f020647e858657a8a7d35ed8149d1ef4a2babf45736e25719b5930"} Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.983793 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32b34eb532f020647e858657a8a7d35ed8149d1ef4a2babf45736e25719b5930" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.983522 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x52kd" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.987130 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.987206 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8","Type":"ContainerDied","Data":"ab4888e680886af6dcdf3e540fb303504122f430ee43f35d5e29af27f467652e"} Dec 03 14:26:51 crc kubenswrapper[5004]: I1203 14:26:51.987249 5004 scope.go:117] "RemoveContainer" containerID="30f64a868a4232df09656f53cf6bedfcd6f9d4e6d8ed08ee56e379f6b290cd22" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.043724 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.060804 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071011 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:52 crc kubenswrapper[5004]: E1203 14:26:52.071651 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-log" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071669 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-log" Dec 03 14:26:52 crc kubenswrapper[5004]: E1203 14:26:52.071699 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" containerName="keystone-bootstrap" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071706 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" containerName="keystone-bootstrap" Dec 03 14:26:52 crc kubenswrapper[5004]: E1203 14:26:52.071735 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-httpd" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071741 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-httpd" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071970 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" containerName="keystone-bootstrap" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.071994 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-httpd" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.072011 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" containerName="glance-log" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.073495 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.076795 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.077129 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.100273 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.232439 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 
14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.232531 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jc8j\" (UniqueName: \"kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.232569 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.232668 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.232908 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.233196 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " 
pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.233228 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.233253 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.334683 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jc8j\" (UniqueName: \"kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.334750 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.334810 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 
crc kubenswrapper[5004]: I1203 14:26:52.334848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.334933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.334986 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.335365 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.335532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.335934 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.336026 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.336799 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.345014 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.345532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.346584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.346963 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.359839 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jc8j\" (UniqueName: \"kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.380113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.391911 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x52kd"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.415706 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x52kd"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.431675 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.505248 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xpj88"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.517396 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.523529 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.525338 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.525574 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.525767 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.541244 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4rh8q" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.569644 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xpj88"] Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648104 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6nv\" (UniqueName: \"kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648183 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648224 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648249 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648308 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.648332 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750394 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6nv\" (UniqueName: 
\"kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750472 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750573 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750653 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.750695 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts\") pod \"keystone-bootstrap-xpj88\" 
(UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.755284 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.756064 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.763622 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.763777 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.764201 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.779914 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6nv\" (UniqueName: \"kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv\") pod \"keystone-bootstrap-xpj88\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.824893 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.824952 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:26:52 crc kubenswrapper[5004]: I1203 14:26:52.863085 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:26:53 crc kubenswrapper[5004]: I1203 14:26:53.639739 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9" path="/var/lib/kubelet/pods/b03d7d0f-14e6-46ec-bea4-0a0e57cf1ce9/volumes" Dec 03 14:26:53 crc kubenswrapper[5004]: I1203 14:26:53.640447 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8" path="/var/lib/kubelet/pods/f0ba9e3c-1eec-4a8b-a4fe-1d40dc74d6b8/volumes" Dec 03 14:26:56 crc kubenswrapper[5004]: I1203 14:26:56.847971 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:26:56 crc kubenswrapper[5004]: I1203 14:26:56.848796 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:26:57 crc kubenswrapper[5004]: I1203 14:26:57.271714 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 03 14:26:57 crc kubenswrapper[5004]: I1203 14:26:57.272304 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:27:01 crc kubenswrapper[5004]: E1203 14:27:01.542557 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 14:27:01 crc kubenswrapper[5004]: E1203 14:27:01.543540 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fg6xs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-646bh_openstack(c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:27:01 crc kubenswrapper[5004]: E1203 14:27:01.544752 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-646bh" 
podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.632063 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.650970 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735576 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key\") pod \"893b17ae-7521-48d8-8285-75b91f9f0936\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735651 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key\") pod \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735680 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts\") pod \"893b17ae-7521-48d8-8285-75b91f9f0936\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735745 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f\") pod \"893b17ae-7521-48d8-8285-75b91f9f0936\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735781 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data\") pod \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735809 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts\") pod \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735847 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs\") pod \"893b17ae-7521-48d8-8285-75b91f9f0936\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735930 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs\") pod \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.735980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749gx\" (UniqueName: \"kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx\") pod \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\" (UID: \"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736072 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data\") pod \"893b17ae-7521-48d8-8285-75b91f9f0936\" (UID: \"893b17ae-7521-48d8-8285-75b91f9f0936\") " Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736428 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs" (OuterVolumeSpecName: "logs") pod "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" (UID: "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736597 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736598 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts" (OuterVolumeSpecName: "scripts") pod "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" (UID: "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736663 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts" (OuterVolumeSpecName: "scripts") pod "893b17ae-7521-48d8-8285-75b91f9f0936" (UID: "893b17ae-7521-48d8-8285-75b91f9f0936"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736738 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data" (OuterVolumeSpecName: "config-data") pod "893b17ae-7521-48d8-8285-75b91f9f0936" (UID: "893b17ae-7521-48d8-8285-75b91f9f0936"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.736844 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs" (OuterVolumeSpecName: "logs") pod "893b17ae-7521-48d8-8285-75b91f9f0936" (UID: "893b17ae-7521-48d8-8285-75b91f9f0936"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.737465 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data" (OuterVolumeSpecName: "config-data") pod "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" (UID: "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.742049 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" (UID: "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.744278 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "893b17ae-7521-48d8-8285-75b91f9f0936" (UID: "893b17ae-7521-48d8-8285-75b91f9f0936"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.745039 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx" (OuterVolumeSpecName: "kube-api-access-749gx") pod "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" (UID: "119b6d49-de5d-42d6-9bbb-4a8a5403c5b8"). InnerVolumeSpecName "kube-api-access-749gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.750434 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f" (OuterVolumeSpecName: "kube-api-access-7gz4f") pod "893b17ae-7521-48d8-8285-75b91f9f0936" (UID: "893b17ae-7521-48d8-8285-75b91f9f0936"). InnerVolumeSpecName "kube-api-access-7gz4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837657 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gz4f\" (UniqueName: \"kubernetes.io/projected/893b17ae-7521-48d8-8285-75b91f9f0936-kube-api-access-7gz4f\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837691 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837702 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837715 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/893b17ae-7521-48d8-8285-75b91f9f0936-logs\") on node 
\"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837728 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-749gx\" (UniqueName: \"kubernetes.io/projected/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-kube-api-access-749gx\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837739 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837750 5004 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/893b17ae-7521-48d8-8285-75b91f9f0936-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837760 5004 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.837771 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893b17ae-7521-48d8-8285-75b91f9f0936-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:01 crc kubenswrapper[5004]: W1203 14:27:01.922660 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0f1c734_5c6e_4f15_8f11_1e3c1da2d880.slice/crio-c87b348738f12fdcb6a5d39ee5d389e98ac164ff959e553a7ad08837cb70fe4c WatchSource:0}: Error finding container c87b348738f12fdcb6a5d39ee5d389e98ac164ff959e553a7ad08837cb70fe4c: Status 404 returned error can't find the container with id c87b348738f12fdcb6a5d39ee5d389e98ac164ff959e553a7ad08837cb70fe4c Dec 03 14:27:01 crc kubenswrapper[5004]: E1203 14:27:01.922961 5004 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 03 14:27:01 crc kubenswrapper[5004]: E1203 14:27:01.923527 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n695h65h5d9h6bh66ch65ch589h8fh9bh65h57h68dh674hd5h8chbbh55h57fh5b5h5b6hf9h687h555h649h656h58dh5cbh56bhfh5dh677h666q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kvw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a3dd33fc-70e6-4c71-903a-1337fa225e82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.930377 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.940244 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:27:01 crc kubenswrapper[5004]: I1203 14:27:01.948097 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040278 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data\") pod \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040615 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040641 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs\") pod \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040677 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb\") pod \"e604b4a5-cf48-4060-b7f2-556bca7840d3\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040708 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc\") pod \"e604b4a5-cf48-4060-b7f2-556bca7840d3\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040731 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpsrn\" (UniqueName: 
\"kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn\") pod \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040757 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm44z\" (UniqueName: \"kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z\") pod \"e604b4a5-cf48-4060-b7f2-556bca7840d3\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key\") pod \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040819 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config\") pod \"e604b4a5-cf48-4060-b7f2-556bca7840d3\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040839 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040918 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.040997 
5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041031 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041072 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts\") pod \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\" (UID: \"24c4c4e3-c00c-470e-ba56-3e50b7f8187f\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041135 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-525l9\" (UniqueName: \"kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: \"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041170 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb\") pod \"e604b4a5-cf48-4060-b7f2-556bca7840d3\" (UID: \"e604b4a5-cf48-4060-b7f2-556bca7840d3\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041218 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs\") pod \"3f11834c-c204-4198-b3c3-ccd29b7b4882\" (UID: 
\"3f11834c-c204-4198-b3c3-ccd29b7b4882\") " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041342 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data" (OuterVolumeSpecName: "config-data") pod "24c4c4e3-c00c-470e-ba56-3e50b7f8187f" (UID: "24c4c4e3-c00c-470e-ba56-3e50b7f8187f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.041949 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.042233 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts" (OuterVolumeSpecName: "scripts") pod "24c4c4e3-c00c-470e-ba56-3e50b7f8187f" (UID: "24c4c4e3-c00c-470e-ba56-3e50b7f8187f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.042842 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.044770 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs" (OuterVolumeSpecName: "logs") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.048485 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.048515 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs" (OuterVolumeSpecName: "logs") pod "24c4c4e3-c00c-470e-ba56-3e50b7f8187f" (UID: "24c4c4e3-c00c-470e-ba56-3e50b7f8187f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.053321 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts" (OuterVolumeSpecName: "scripts") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.054163 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn" (OuterVolumeSpecName: "kube-api-access-jpsrn") pod "24c4c4e3-c00c-470e-ba56-3e50b7f8187f" (UID: "24c4c4e3-c00c-470e-ba56-3e50b7f8187f"). InnerVolumeSpecName "kube-api-access-jpsrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.056451 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "24c4c4e3-c00c-470e-ba56-3e50b7f8187f" (UID: "24c4c4e3-c00c-470e-ba56-3e50b7f8187f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.059993 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9" (OuterVolumeSpecName: "kube-api-access-525l9") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "kube-api-access-525l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.064188 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z" (OuterVolumeSpecName: "kube-api-access-pm44z") pod "e604b4a5-cf48-4060-b7f2-556bca7840d3" (UID: "e604b4a5-cf48-4060-b7f2-556bca7840d3"). InnerVolumeSpecName "kube-api-access-pm44z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.091452 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8645b8f4d9-4xc7j" event={"ID":"893b17ae-7521-48d8-8285-75b91f9f0936","Type":"ContainerDied","Data":"932a26b575269cedad8fb33fc2be370f895e8cbfe338a6d7151d99aa45409a5d"} Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.091591 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8645b8f4d9-4xc7j" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.097474 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868d7445fc-kvh87" event={"ID":"24c4c4e3-c00c-470e-ba56-3e50b7f8187f","Type":"ContainerDied","Data":"46bfd0481e53fe00932e62eedcf57af4e669e8a1feee9ceef239e2255efeac0d"} Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.097591 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868d7445fc-kvh87" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.116290 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.125756 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rzqx7" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.126583 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rzqx7" event={"ID":"e604b4a5-cf48-4060-b7f2-556bca7840d3","Type":"ContainerDied","Data":"fe3e5b36e0ce02efaffbb0dea40947422a194f94e8b0914e9c4fbd3514ec68f9"} Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.129070 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f11834c-c204-4198-b3c3-ccd29b7b4882","Type":"ContainerDied","Data":"41da35e897c9883742c8575add1344efa13b1cd73ddab11a094bd1eb82979ed0"} Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.129196 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.129589 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e604b4a5-cf48-4060-b7f2-556bca7840d3" (UID: "e604b4a5-cf48-4060-b7f2-556bca7840d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.134250 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e604b4a5-cf48-4060-b7f2-556bca7840d3" (UID: "e604b4a5-cf48-4060-b7f2-556bca7840d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.135525 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79df97d86b-4dr9p" event={"ID":"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880","Type":"ContainerStarted","Data":"c87b348738f12fdcb6a5d39ee5d389e98ac164ff959e553a7ad08837cb70fe4c"} Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.139877 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554f88d76f-hr4dj" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.140536 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config" (OuterVolumeSpecName: "config") pod "e604b4a5-cf48-4060-b7f2-556bca7840d3" (UID: "e604b4a5-cf48-4060-b7f2-556bca7840d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.140667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554f88d76f-hr4dj" event={"ID":"119b6d49-de5d-42d6-9bbb-4a8a5403c5b8","Type":"ContainerDied","Data":"ee92a19af896152991a9f35f8762e7e00367522d1ffedff2a979e1e90fe0f122"} Dec 03 14:27:02 crc kubenswrapper[5004]: E1203 14:27:02.144208 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-646bh" podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144696 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144736 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144750 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144785 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-525l9\" (UniqueName: \"kubernetes.io/projected/3f11834c-c204-4198-b3c3-ccd29b7b4882-kube-api-access-525l9\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144813 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144824 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144872 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144885 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144895 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144906 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpsrn\" (UniqueName: \"kubernetes.io/projected/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-kube-api-access-jpsrn\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.144940 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm44z\" (UniqueName: \"kubernetes.io/projected/e604b4a5-cf48-4060-b7f2-556bca7840d3-kube-api-access-pm44z\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.146436 5004 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24c4c4e3-c00c-470e-ba56-3e50b7f8187f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: 
I1203 14:27:02.146457 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.146470 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f11834c-c204-4198-b3c3-ccd29b7b4882-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.161758 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data" (OuterVolumeSpecName: "config-data") pod "3f11834c-c204-4198-b3c3-ccd29b7b4882" (UID: "3f11834c-c204-4198-b3c3-ccd29b7b4882"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.176688 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e604b4a5-cf48-4060-b7f2-556bca7840d3" (UID: "e604b4a5-cf48-4060-b7f2-556bca7840d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.181142 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.248297 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.248336 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e604b4a5-cf48-4060-b7f2-556bca7840d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.248346 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f11834c-c204-4198-b3c3-ccd29b7b4882-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.272199 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rzqx7" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.279948 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.295820 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-554f88d76f-hr4dj"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.316940 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.332618 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-868d7445fc-kvh87"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.379483 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.391135 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8645b8f4d9-4xc7j"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.415112 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-587bd47d68-c6stc"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.467305 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.481077 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rzqx7"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.509038 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.516295 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.522653 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:27:02 crc kubenswrapper[5004]: E1203 14:27:02.523063 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523085 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" Dec 03 14:27:02 crc kubenswrapper[5004]: E1203 14:27:02.523099 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="init" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523106 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="init" Dec 03 14:27:02 crc kubenswrapper[5004]: E1203 14:27:02.523117 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-log" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523123 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-log" Dec 03 14:27:02 crc kubenswrapper[5004]: E1203 14:27:02.523133 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-httpd" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523138 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-httpd" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523339 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-log" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523367 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" containerName="glance-httpd" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.523379 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" containerName="dnsmasq-dns" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.524339 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.527975 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.528060 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.532259 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564272 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564343 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wzq\" (UniqueName: \"kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564417 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564469 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.564495 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667013 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667130 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667182 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wzq\" (UniqueName: \"kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667243 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667269 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667311 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667329 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.667943 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.668420 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.668575 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.673171 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.673199 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.673811 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.674299 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.687726 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wzq\" (UniqueName: \"kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.696801 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:27:02 crc kubenswrapper[5004]: I1203 14:27:02.868304 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.534797 5004 scope.go:117] "RemoveContainer" containerID="fb1a05af2c98bb6795aac23cb691141ae23f7f386536082f83576c7310db61b6" Dec 03 14:27:03 crc kubenswrapper[5004]: E1203 14:27:03.541940 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 14:27:03 crc kubenswrapper[5004]: E1203 14:27:03.542132 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zsgqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wv94f_openstack(82059d63-43a0-43ed-b9ea-9c54f700a2dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:27:03 crc kubenswrapper[5004]: E1203 14:27:03.543625 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wv94f" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.604591 5004 scope.go:117] "RemoveContainer" containerID="2ca4bb0cdf21fa6794a7b1bd8a34aec197c49285cc29973851b12e761801543a" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.624700 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119b6d49-de5d-42d6-9bbb-4a8a5403c5b8" path="/var/lib/kubelet/pods/119b6d49-de5d-42d6-9bbb-4a8a5403c5b8/volumes" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.666406 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24c4c4e3-c00c-470e-ba56-3e50b7f8187f" path="/var/lib/kubelet/pods/24c4c4e3-c00c-470e-ba56-3e50b7f8187f/volumes" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.667370 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f11834c-c204-4198-b3c3-ccd29b7b4882" path="/var/lib/kubelet/pods/3f11834c-c204-4198-b3c3-ccd29b7b4882/volumes" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.668644 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="893b17ae-7521-48d8-8285-75b91f9f0936" path="/var/lib/kubelet/pods/893b17ae-7521-48d8-8285-75b91f9f0936/volumes" Dec 03 14:27:03 crc kubenswrapper[5004]: I1203 14:27:03.669961 5004 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e604b4a5-cf48-4060-b7f2-556bca7840d3" path="/var/lib/kubelet/pods/e604b4a5-cf48-4060-b7f2-556bca7840d3/volumes" Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.132899 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xpj88"] Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.157192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerStarted","Data":"c8f8d88fd60e9feec438572e394893a90bc99976e8ad2b5a4c5ac9a1d6c15b74"} Dec 03 14:27:04 crc kubenswrapper[5004]: E1203 14:27:04.222682 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wv94f" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" Dec 03 14:27:04 crc kubenswrapper[5004]: W1203 14:27:04.260776 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124d03f5_14a3_430b_acb8_4a2b2fb79d37.slice/crio-73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71 WatchSource:0}: Error finding container 73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71: Status 404 returned error can't find the container with id 73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71 Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.293060 5004 scope.go:117] "RemoveContainer" containerID="d608c980c134d9a189e869f1028e1e66ff452f3c9ff41a542698436fb170e3db" Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.372189 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.570330 5004 
scope.go:117] "RemoveContainer" containerID="66f23ad167f111d0d85847cc13469b12a5a184610c48ca541d9f23a59fc74f17" Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.618161 5004 scope.go:117] "RemoveContainer" containerID="3ae4bc4f0debca898c60eb27dad9f3e2b8d86b35b59453f7e8416be6a0472150" Dec 03 14:27:04 crc kubenswrapper[5004]: I1203 14:27:04.877676 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.175771 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8csk5" event={"ID":"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6","Type":"ContainerStarted","Data":"29531f1ce2a7659a3479164bbf9895db81399c25c170f8e20b074c4f392b167c"} Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.177107 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerStarted","Data":"fb216181e3a08602415beff525613ffb1d46709cc8299b1baf3a5796025978a6"} Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.178680 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpj88" event={"ID":"124d03f5-14a3-430b-acb8-4a2b2fb79d37","Type":"ContainerStarted","Data":"d2b99edabb08722467a07ede584711b4db2a567eb0c271df6d45257baa7fcb24"} Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.178708 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpj88" event={"ID":"124d03f5-14a3-430b-acb8-4a2b2fb79d37","Type":"ContainerStarted","Data":"73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71"} Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.184904 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79df97d86b-4dr9p" 
event={"ID":"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880","Type":"ContainerStarted","Data":"810cf49c727ccdc8edba27adcde5c2b23afc22106a41aa5c20515a04f077ebae"} Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.188091 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerStarted","Data":"170abb84be82d97c90bb1e36346b7122190618e65add8272765ea09c84996d53"} Dec 03 14:27:05 crc kubenswrapper[5004]: W1203 14:27:05.208084 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1690b697_2b2a_4c83_8494_b5e525a414ea.slice/crio-3ccf3f197beda4a197c54d9872c91d7a8a245e26aac4528a75dfc9dcd52116cc WatchSource:0}: Error finding container 3ccf3f197beda4a197c54d9872c91d7a8a245e26aac4528a75dfc9dcd52116cc: Status 404 returned error can't find the container with id 3ccf3f197beda4a197c54d9872c91d7a8a245e26aac4528a75dfc9dcd52116cc Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.226466 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8csk5" podStartSLOduration=5.973947001 podStartE2EDuration="39.226430083s" podCreationTimestamp="2025-12-03 14:26:26 +0000 UTC" firstStartedPulling="2025-12-03 14:26:27.704352729 +0000 UTC m=+1200.453322965" lastFinishedPulling="2025-12-03 14:27:00.956835801 +0000 UTC m=+1233.705806047" observedRunningTime="2025-12-03 14:27:05.205047132 +0000 UTC m=+1237.954017368" watchObservedRunningTime="2025-12-03 14:27:05.226430083 +0000 UTC m=+1237.975400319" Dec 03 14:27:05 crc kubenswrapper[5004]: I1203 14:27:05.244199 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xpj88" podStartSLOduration=13.24415988 podStartE2EDuration="13.24415988s" podCreationTimestamp="2025-12-03 14:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:05.231533159 +0000 UTC m=+1237.980503395" watchObservedRunningTime="2025-12-03 14:27:05.24415988 +0000 UTC m=+1237.993130116" Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.199680 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79df97d86b-4dr9p" event={"ID":"e0f1c734-5c6e-4f15-8f11-1e3c1da2d880","Type":"ContainerStarted","Data":"0e65faea8cc2ce9d91f7c32e02c46132f72edd55400b19a57adecffa2c7b89da"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.206992 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerStarted","Data":"a6a322c31ab3f0c4562187a886610d52367b6408ee0c2e18e7e58d3f499d53ce"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.207029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerStarted","Data":"3ccf3f197beda4a197c54d9872c91d7a8a245e26aac4528a75dfc9dcd52116cc"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.220520 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerStarted","Data":"85251560e3d53350eb3d01e94ef00b8672f076f3cfd3f4a9681e204313af876e"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.225728 5004 generic.go:334] "Generic (PLEG): container finished" podID="84d38e9d-5aea-4c66-8c17-cc31d9494116" containerID="7868aa0b8066bfecd1cb2ca83f6c3f89c7491195180e8311bc7539b3977d899e" exitCode=0 Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.225815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c9xfn" 
event={"ID":"84d38e9d-5aea-4c66-8c17-cc31d9494116","Type":"ContainerDied","Data":"7868aa0b8066bfecd1cb2ca83f6c3f89c7491195180e8311bc7539b3977d899e"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.231116 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79df97d86b-4dr9p" podStartSLOduration=26.866149896 podStartE2EDuration="29.231100196s" podCreationTimestamp="2025-12-03 14:26:37 +0000 UTC" firstStartedPulling="2025-12-03 14:27:01.92812791 +0000 UTC m=+1234.677098146" lastFinishedPulling="2025-12-03 14:27:04.29307821 +0000 UTC m=+1237.042048446" observedRunningTime="2025-12-03 14:27:06.229464859 +0000 UTC m=+1238.978435105" watchObservedRunningTime="2025-12-03 14:27:06.231100196 +0000 UTC m=+1238.980070432" Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.233601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerStarted","Data":"554a30263a830941569cbb5d0572bdb44c33ff59ddd7f9891f94c991971a9a25"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.233644 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerStarted","Data":"7002bfac4aebf05c2adccb4104c283116bdf8849a79cbba847009be002162ca7"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.236188 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerStarted","Data":"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f"} Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.309707 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-587bd47d68-c6stc" podStartSLOduration=28.377410079 podStartE2EDuration="29.309683682s" podCreationTimestamp="2025-12-03 14:26:37 +0000 
UTC" firstStartedPulling="2025-12-03 14:27:03.515933962 +0000 UTC m=+1236.264904238" lastFinishedPulling="2025-12-03 14:27:04.448207605 +0000 UTC m=+1237.197177841" observedRunningTime="2025-12-03 14:27:06.303377722 +0000 UTC m=+1239.052347958" watchObservedRunningTime="2025-12-03 14:27:06.309683682 +0000 UTC m=+1239.058653918" Dec 03 14:27:06 crc kubenswrapper[5004]: I1203 14:27:06.348445 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.34841666 podStartE2EDuration="14.34841666s" podCreationTimestamp="2025-12-03 14:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:06.348030609 +0000 UTC m=+1239.097000845" watchObservedRunningTime="2025-12-03 14:27:06.34841666 +0000 UTC m=+1239.097386896" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.285499 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerStarted","Data":"bbfe6de4298b97ef3f233b199b3c250e82ffbff3f19f7ef17d2c2c8f966c7c05"} Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.696621 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.720077 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.720053952 podStartE2EDuration="5.720053952s" podCreationTimestamp="2025-12-03 14:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:07.310831033 +0000 UTC m=+1240.059801279" watchObservedRunningTime="2025-12-03 14:27:07.720053952 +0000 UTC m=+1240.469024208" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.786528 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config\") pod \"84d38e9d-5aea-4c66-8c17-cc31d9494116\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.786679 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmts\" (UniqueName: \"kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts\") pod \"84d38e9d-5aea-4c66-8c17-cc31d9494116\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.786761 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle\") pod \"84d38e9d-5aea-4c66-8c17-cc31d9494116\" (UID: \"84d38e9d-5aea-4c66-8c17-cc31d9494116\") " Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.814537 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts" (OuterVolumeSpecName: "kube-api-access-gtmts") pod 
"84d38e9d-5aea-4c66-8c17-cc31d9494116" (UID: "84d38e9d-5aea-4c66-8c17-cc31d9494116"). InnerVolumeSpecName "kube-api-access-gtmts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.817144 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config" (OuterVolumeSpecName: "config") pod "84d38e9d-5aea-4c66-8c17-cc31d9494116" (UID: "84d38e9d-5aea-4c66-8c17-cc31d9494116"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.818019 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84d38e9d-5aea-4c66-8c17-cc31d9494116" (UID: "84d38e9d-5aea-4c66-8c17-cc31d9494116"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.889183 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.889219 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/84d38e9d-5aea-4c66-8c17-cc31d9494116-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:07 crc kubenswrapper[5004]: I1203 14:27:07.889233 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmts\" (UniqueName: \"kubernetes.io/projected/84d38e9d-5aea-4c66-8c17-cc31d9494116-kube-api-access-gtmts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.175313 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.175391 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.357668 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c9xfn" event={"ID":"84d38e9d-5aea-4c66-8c17-cc31d9494116","Type":"ContainerDied","Data":"44395539fd8de7450c7049e999668c5cc194baee6ea5495bf38dc0753002de43"} Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.357716 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44395539fd8de7450c7049e999668c5cc194baee6ea5495bf38dc0753002de43" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.357790 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-c9xfn" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.389256 5004 generic.go:334] "Generic (PLEG): container finished" podID="7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" containerID="29531f1ce2a7659a3479164bbf9895db81399c25c170f8e20b074c4f392b167c" exitCode=0 Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.390771 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8csk5" event={"ID":"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6","Type":"ContainerDied","Data":"29531f1ce2a7659a3479164bbf9895db81399c25c170f8e20b074c4f392b167c"} Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.409758 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.410918 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.662923 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:08 crc kubenswrapper[5004]: E1203 14:27:08.663278 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d38e9d-5aea-4c66-8c17-cc31d9494116" containerName="neutron-db-sync" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.663296 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d38e9d-5aea-4c66-8c17-cc31d9494116" containerName="neutron-db-sync" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.663520 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d38e9d-5aea-4c66-8c17-cc31d9494116" containerName="neutron-db-sync" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.664421 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.696385 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.744405 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.749560 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.754498 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.755902 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.756176 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xt67s" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.756406 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.763426 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.832583 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.832662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.833563 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config\") pod 
\"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.833659 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.833752 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fts6k\" (UniqueName: \"kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.833816 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935158 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935247 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config\") pod 
\"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935277 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxz9\" (UniqueName: \"kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935309 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935333 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935362 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc\") pod 
\"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935479 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fts6k\" (UniqueName: \"kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935552 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935593 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.935616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.936587 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: 
\"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.936684 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.936696 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.937339 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.937392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:08 crc kubenswrapper[5004]: I1203 14:27:08.959799 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fts6k\" (UniqueName: \"kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k\") pod \"dnsmasq-dns-55f844cf75-89jhk\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 
14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.020246 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.036949 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.037222 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.037532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.037693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxz9\" (UniqueName: \"kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.037787 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle\") pod \"neutron-67457f9876-l5kn6\" (UID: 
\"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.042614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.048783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.053567 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.057330 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.067626 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxz9\" (UniqueName: \"kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9\") pod \"neutron-67457f9876-l5kn6\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 
14:27:09.092527 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.457693 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.790339 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8csk5" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.862715 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp4ds\" (UniqueName: \"kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds\") pod \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.862817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs\") pod \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.862944 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts\") pod \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.862980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle\") pod \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.863043 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data\") pod \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\" (UID: \"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6\") " Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.878183 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs" (OuterVolumeSpecName: "logs") pod "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" (UID: "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.879336 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds" (OuterVolumeSpecName: "kube-api-access-qp4ds") pod "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" (UID: "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6"). InnerVolumeSpecName "kube-api-access-qp4ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.890808 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data" (OuterVolumeSpecName: "config-data") pod "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" (UID: "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.891041 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts" (OuterVolumeSpecName: "scripts") pod "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" (UID: "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.927205 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" (UID: "7eb30bfb-060c-4c8f-aab0-6ca0befc83d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.928762 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.985057 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.985583 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.985601 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.985616 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp4ds\" (UniqueName: \"kubernetes.io/projected/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-kube-api-access-qp4ds\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:09 crc kubenswrapper[5004]: I1203 14:27:09.985630 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:10 crc 
kubenswrapper[5004]: I1203 14:27:10.439554 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8csk5" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.440298 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8csk5" event={"ID":"7eb30bfb-060c-4c8f-aab0-6ca0befc83d6","Type":"ContainerDied","Data":"eaed4a63d81a75ec22cf77d62ba4c382d24394e714be015e01b342ebf4f75436"} Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.440330 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaed4a63d81a75ec22cf77d62ba4c382d24394e714be015e01b342ebf4f75436" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.447345 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerStarted","Data":"b24ad8325c3b9ffbd2d5b5b2a5eac916da8124e900ce18e3c67a59edfcff8578"} Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.449243 5004 generic.go:334] "Generic (PLEG): container finished" podID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerID="4d5d8db4142cd1542e6c685d26ecb00f2160b99c572036ea261da4c5a95253d9" exitCode=0 Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.449289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" event={"ID":"928b00e0-b667-4430-9bd3-2423bf037d6e","Type":"ContainerDied","Data":"4d5d8db4142cd1542e6c685d26ecb00f2160b99c572036ea261da4c5a95253d9"} Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.449316 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" event={"ID":"928b00e0-b667-4430-9bd3-2423bf037d6e","Type":"ContainerStarted","Data":"e0a79a2748bcd02709c867e84e56ce0aabcd35d18c2acac4fcfd9be05d769e8a"} Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.616397 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-ccd4cc976-4jrqc"] Dec 03 14:27:10 crc kubenswrapper[5004]: E1203 14:27:10.616883 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" containerName="placement-db-sync" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.616900 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" containerName="placement-db-sync" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.617158 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" containerName="placement-db-sync" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.628692 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ccd4cc976-4jrqc"] Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.629001 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.632515 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7pj4f" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.632552 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.632770 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.632984 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.637851 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695403 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-public-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695460 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-config-data\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695484 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-logs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695528 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-combined-ca-bundle\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695547 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-scripts\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695589 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ghftv\" (UniqueName: \"kubernetes.io/projected/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-kube-api-access-ghftv\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.695603 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-internal-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.796560 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-public-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.796935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-config-data\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.796962 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-logs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.796980 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-combined-ca-bundle\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.797001 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-scripts\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.797027 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-internal-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.797049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghftv\" (UniqueName: \"kubernetes.io/projected/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-kube-api-access-ghftv\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.797439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-logs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.801667 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-public-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" 
(UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.802015 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-internal-tls-certs\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.805262 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-config-data\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.806349 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-combined-ca-bundle\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.808202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-scripts\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc kubenswrapper[5004]: I1203 14:27:10.828641 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghftv\" (UniqueName: \"kubernetes.io/projected/a6f2bf21-eade-495e-99bb-4d12b3c46c3b-kube-api-access-ghftv\") pod \"placement-ccd4cc976-4jrqc\" (UID: \"a6f2bf21-eade-495e-99bb-4d12b3c46c3b\") " pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:10 crc 
kubenswrapper[5004]: I1203 14:27:10.953349 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.053322 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66456bfc4f-v6lrf"] Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.055087 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.059440 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.059937 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.079888 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66456bfc4f-v6lrf"] Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.212775 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-ovndb-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.212924 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-public-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.212996 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-httpd-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.213069 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-internal-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.213104 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.213139 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgwzw\" (UniqueName: \"kubernetes.io/projected/5c21b585-fe01-4f87-9a60-1df17f266659-kube-api-access-dgwzw\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.213199 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-combined-ca-bundle\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.314940 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-public-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-httpd-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315437 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-internal-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315560 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315668 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwzw\" (UniqueName: \"kubernetes.io/projected/5c21b585-fe01-4f87-9a60-1df17f266659-kube-api-access-dgwzw\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315773 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-combined-ca-bundle\") pod 
\"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.315889 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-ovndb-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.324763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-httpd-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.328668 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-combined-ca-bundle\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.339632 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-public-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.340436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-ovndb-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 
14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.359946 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-config\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.362992 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c21b585-fe01-4f87-9a60-1df17f266659-internal-tls-certs\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.364959 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwzw\" (UniqueName: \"kubernetes.io/projected/5c21b585-fe01-4f87-9a60-1df17f266659-kube-api-access-dgwzw\") pod \"neutron-66456bfc4f-v6lrf\" (UID: \"5c21b585-fe01-4f87-9a60-1df17f266659\") " pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.382061 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.470442 5004 generic.go:334] "Generic (PLEG): container finished" podID="124d03f5-14a3-430b-acb8-4a2b2fb79d37" containerID="d2b99edabb08722467a07ede584711b4db2a567eb0c271df6d45257baa7fcb24" exitCode=0 Dec 03 14:27:11 crc kubenswrapper[5004]: I1203 14:27:11.470495 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpj88" event={"ID":"124d03f5-14a3-430b-acb8-4a2b2fb79d37","Type":"ContainerDied","Data":"d2b99edabb08722467a07ede584711b4db2a567eb0c271df6d45257baa7fcb24"} Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.432530 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.432960 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.470732 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.478360 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.493500 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.870027 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.870075 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.918570 5004 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:12 crc kubenswrapper[5004]: I1203 14:27:12.920605 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:13 crc kubenswrapper[5004]: I1203 14:27:13.492215 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:13 crc kubenswrapper[5004]: I1203 14:27:13.492273 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 14:27:13 crc kubenswrapper[5004]: I1203 14:27:13.492506 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:14 crc kubenswrapper[5004]: I1203 14:27:14.501019 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:27:15 crc kubenswrapper[5004]: I1203 14:27:15.070665 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:27:15 crc kubenswrapper[5004]: I1203 14:27:15.512433 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:27:15 crc kubenswrapper[5004]: I1203 14:27:15.515572 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:27:15 crc kubenswrapper[5004]: I1203 14:27:15.512450 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:27:15 crc kubenswrapper[5004]: I1203 14:27:15.975968 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.062572 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.111877 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.111945 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.111977 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.112314 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.112373 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.112499 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6nv\" (UniqueName: 
\"kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv\") pod \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\" (UID: \"124d03f5-14a3-430b-acb8-4a2b2fb79d37\") " Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.145824 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.149478 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.164010 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts" (OuterVolumeSpecName: "scripts") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.166280 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv" (OuterVolumeSpecName: "kube-api-access-tv6nv") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "kube-api-access-tv6nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.167739 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.184282 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.217633 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6nv\" (UniqueName: \"kubernetes.io/projected/124d03f5-14a3-430b-acb8-4a2b2fb79d37-kube-api-access-tv6nv\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.217672 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.217684 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.217693 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.217705 5004 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.293924 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data" (OuterVolumeSpecName: "config-data") pod "124d03f5-14a3-430b-acb8-4a2b2fb79d37" (UID: "124d03f5-14a3-430b-acb8-4a2b2fb79d37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.319293 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/124d03f5-14a3-430b-acb8-4a2b2fb79d37-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.385213 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.563137 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerStarted","Data":"dd591f0c931b6816b155aec5b3c908f746848a520186c68192a606ea930fb2ac"} Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.569075 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ccd4cc976-4jrqc"] Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.574136 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpj88" event={"ID":"124d03f5-14a3-430b-acb8-4a2b2fb79d37","Type":"ContainerDied","Data":"73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71"} Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.574201 5004 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d3f0124d8c83c1106ea31ad310e97ca70cc427ef78d05cc49e26834187de71" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.574341 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpj88" Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.580939 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" event={"ID":"928b00e0-b667-4430-9bd3-2423bf037d6e","Type":"ContainerStarted","Data":"5d97190441b12e18ba6b219a7b4fcf6e8f6e174ece11a627f7a20ce5b56993a0"} Dec 03 14:27:16 crc kubenswrapper[5004]: I1203 14:27:16.603988 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66456bfc4f-v6lrf"] Dec 03 14:27:16 crc kubenswrapper[5004]: W1203 14:27:16.612132 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c21b585_fe01_4f87_9a60_1df17f266659.slice/crio-4b0ba1c056e9ebc8796a917d2eb89957de149aa61ae3543108bd2fe92a23912b WatchSource:0}: Error finding container 4b0ba1c056e9ebc8796a917d2eb89957de149aa61ae3543108bd2fe92a23912b: Status 404 returned error can't find the container with id 4b0ba1c056e9ebc8796a917d2eb89957de149aa61ae3543108bd2fe92a23912b Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.151189 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" podStartSLOduration=9.151162755 podStartE2EDuration="9.151162755s" podCreationTimestamp="2025-12-03 14:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:16.611194298 +0000 UTC m=+1249.360164554" watchObservedRunningTime="2025-12-03 14:27:17.151162755 +0000 UTC m=+1249.900132991" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.287798 5004 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65f67fcd5d-5p75z"] Dec 03 14:27:17 crc kubenswrapper[5004]: E1203 14:27:17.288430 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124d03f5-14a3-430b-acb8-4a2b2fb79d37" containerName="keystone-bootstrap" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.288444 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="124d03f5-14a3-430b-acb8-4a2b2fb79d37" containerName="keystone-bootstrap" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.288621 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="124d03f5-14a3-430b-acb8-4a2b2fb79d37" containerName="keystone-bootstrap" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.289256 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.296688 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.296927 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4rh8q" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.297055 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.297251 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.297365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.297423 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.299141 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-65f67fcd5d-5p75z"] Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.342902 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-internal-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343021 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92h57\" (UniqueName: \"kubernetes.io/projected/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-kube-api-access-92h57\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343053 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-scripts\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343089 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-public-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343109 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-combined-ca-bundle\") pod \"keystone-65f67fcd5d-5p75z\" (UID: 
\"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343134 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-credential-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343150 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-fernet-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.343165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-config-data\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.445948 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92h57\" (UniqueName: \"kubernetes.io/projected/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-kube-api-access-92h57\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446018 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-scripts\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " 
pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446050 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-public-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446066 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-combined-ca-bundle\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446094 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-credential-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446110 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-fernet-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.446124 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-config-data\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 
14:27:17.446182 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-internal-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.451577 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-combined-ca-bundle\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.456436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-internal-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.460265 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-credential-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.464764 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-fernet-keys\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.465227 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-scripts\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.465705 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-config-data\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.468927 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-public-tls-certs\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.473341 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92h57\" (UniqueName: \"kubernetes.io/projected/7efdab5a-a074-4ce4-bcc0-b2b8481b886c-kube-api-access-92h57\") pod \"keystone-65f67fcd5d-5p75z\" (UID: \"7efdab5a-a074-4ce4-bcc0-b2b8481b886c\") " pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.597405 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerStarted","Data":"9df7744ac7308982f5849dadf0c0067607f2893d9d53424d70637e024304f2cb"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.597984 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.600782 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wv94f" 
event={"ID":"82059d63-43a0-43ed-b9ea-9c54f700a2dc","Type":"ContainerStarted","Data":"e3762ab41b089bbb2b654cb032992671bc7c5239579143c08ef4e7476721e67f"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.605980 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ccd4cc976-4jrqc" event={"ID":"a6f2bf21-eade-495e-99bb-4d12b3c46c3b","Type":"ContainerStarted","Data":"e8dc4330491e93262f0c86abb49bdd6f8480cd9673849019c81bb123ec580ac1"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.606011 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ccd4cc976-4jrqc" event={"ID":"a6f2bf21-eade-495e-99bb-4d12b3c46c3b","Type":"ContainerStarted","Data":"fc7db91b1ce04a50ad010c5eb9dec40030f3a1cd8f42a53611d7745c5a3b0a39"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.606024 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ccd4cc976-4jrqc" event={"ID":"a6f2bf21-eade-495e-99bb-4d12b3c46c3b","Type":"ContainerStarted","Data":"6acc35a5e53331f728b2e80d91700f0f288f300539e03ea04fa5b7c7f4b71827"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.606126 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.606169 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.611077 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66456bfc4f-v6lrf" event={"ID":"5c21b585-fe01-4f87-9a60-1df17f266659","Type":"ContainerStarted","Data":"326bf9065f60a42abfda0839ffac4a627871fca4b1a887c9457623cb6730e718"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.611114 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66456bfc4f-v6lrf" 
event={"ID":"5c21b585-fe01-4f87-9a60-1df17f266659","Type":"ContainerStarted","Data":"badb5be259b2b2200391fcaf6d16113aed2e810663afd36317e247909b3e6c5c"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.611125 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66456bfc4f-v6lrf" event={"ID":"5c21b585-fe01-4f87-9a60-1df17f266659","Type":"ContainerStarted","Data":"4b0ba1c056e9ebc8796a917d2eb89957de149aa61ae3543108bd2fe92a23912b"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.611242 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.618996 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.641239 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67457f9876-l5kn6" podStartSLOduration=9.641217594 podStartE2EDuration="9.641217594s" podCreationTimestamp="2025-12-03 14:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:17.633140553 +0000 UTC m=+1250.382110799" watchObservedRunningTime="2025-12-03 14:27:17.641217594 +0000 UTC m=+1250.390187820" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.644480 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerStarted","Data":"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f"} Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.644530 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.694907 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-66456bfc4f-v6lrf" podStartSLOduration=7.694889398 podStartE2EDuration="7.694889398s" podCreationTimestamp="2025-12-03 14:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:17.691522742 +0000 UTC m=+1250.440492968" watchObservedRunningTime="2025-12-03 14:27:17.694889398 +0000 UTC m=+1250.443859624" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.753917 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wv94f" podStartSLOduration=3.948686561 podStartE2EDuration="52.753895355s" podCreationTimestamp="2025-12-03 14:26:25 +0000 UTC" firstStartedPulling="2025-12-03 14:26:27.514152111 +0000 UTC m=+1200.263122347" lastFinishedPulling="2025-12-03 14:27:16.319360915 +0000 UTC m=+1249.068331141" observedRunningTime="2025-12-03 14:27:17.721946442 +0000 UTC m=+1250.470916678" watchObservedRunningTime="2025-12-03 14:27:17.753895355 +0000 UTC m=+1250.502865601" Dec 03 14:27:17 crc kubenswrapper[5004]: I1203 14:27:17.755140 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ccd4cc976-4jrqc" podStartSLOduration=7.755131001 podStartE2EDuration="7.755131001s" podCreationTimestamp="2025-12-03 14:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:17.749673665 +0000 UTC m=+1250.498643911" watchObservedRunningTime="2025-12-03 14:27:17.755131001 +0000 UTC m=+1250.504101237" Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.191032 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection 
refused" Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.303697 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65f67fcd5d-5p75z"] Dec 03 14:27:18 crc kubenswrapper[5004]: W1203 14:27:18.322561 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7efdab5a_a074_4ce4_bcc0_b2b8481b886c.slice/crio-d014d2cec50a75bf81e53182f45129e091ee5f62937e837053ffc2ca72f2e534 WatchSource:0}: Error finding container d014d2cec50a75bf81e53182f45129e091ee5f62937e837053ffc2ca72f2e534: Status 404 returned error can't find the container with id d014d2cec50a75bf81e53182f45129e091ee5f62937e837053ffc2ca72f2e534 Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.415532 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79df97d86b-4dr9p" podUID="e0f1c734-5c6e-4f15-8f11-1e3c1da2d880" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.638277 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-646bh" event={"ID":"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52","Type":"ContainerStarted","Data":"3a2d6e5ffe7db56bc7dbd8e07419385f55f9b50e5496296af21e05436f870d95"} Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.648578 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65f67fcd5d-5p75z" event={"ID":"7efdab5a-a074-4ce4-bcc0-b2b8481b886c","Type":"ContainerStarted","Data":"d014d2cec50a75bf81e53182f45129e091ee5f62937e837053ffc2ca72f2e534"} Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.649831 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.682005 5004 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-db-sync-646bh" podStartSLOduration=2.984682833 podStartE2EDuration="53.681985938s" podCreationTimestamp="2025-12-03 14:26:25 +0000 UTC" firstStartedPulling="2025-12-03 14:26:27.470632497 +0000 UTC m=+1200.219602733" lastFinishedPulling="2025-12-03 14:27:18.167935602 +0000 UTC m=+1250.916905838" observedRunningTime="2025-12-03 14:27:18.672736024 +0000 UTC m=+1251.421706270" watchObservedRunningTime="2025-12-03 14:27:18.681985938 +0000 UTC m=+1251.430956174" Dec 03 14:27:18 crc kubenswrapper[5004]: I1203 14:27:18.702571 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-65f67fcd5d-5p75z" podStartSLOduration=1.7025479159999999 podStartE2EDuration="1.702547916s" podCreationTimestamp="2025-12-03 14:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:18.701171277 +0000 UTC m=+1251.450141513" watchObservedRunningTime="2025-12-03 14:27:18.702547916 +0000 UTC m=+1251.451518152" Dec 03 14:27:19 crc kubenswrapper[5004]: I1203 14:27:19.664619 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65f67fcd5d-5p75z" event={"ID":"7efdab5a-a074-4ce4-bcc0-b2b8481b886c","Type":"ContainerStarted","Data":"84cf6cf0d4816e57150ef401916a922994afa0c24a9bf1caa4ad441f1366f5c5"} Dec 03 14:27:22 crc kubenswrapper[5004]: I1203 14:27:22.824386 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:27:22 crc kubenswrapper[5004]: I1203 14:27:22.824461 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:27:22 crc kubenswrapper[5004]: I1203 14:27:22.824511 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:27:22 crc kubenswrapper[5004]: I1203 14:27:22.825220 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:27:22 crc kubenswrapper[5004]: I1203 14:27:22.825274 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a" gracePeriod=600 Dec 03 14:27:23 crc kubenswrapper[5004]: I1203 14:27:23.709106 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a" exitCode=0 Dec 03 14:27:23 crc kubenswrapper[5004]: I1203 14:27:23.709150 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a"} Dec 03 14:27:23 crc kubenswrapper[5004]: I1203 14:27:23.709183 5004 scope.go:117] "RemoveContainer" containerID="e9f4b0a50cae7dcdbb79ad537159bcb90f3cc3c38fc2a61c36a3aa3d7865f7d6" Dec 03 14:27:24 crc 
kubenswrapper[5004]: I1203 14:27:24.022007 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.100321 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.100545 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="dnsmasq-dns" containerID="cri-o://f108915e52d7dde700f8ca6ec043e4a50c3383cd60c414f632df9f8fab1b30ad" gracePeriod=10 Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.719522 5004 generic.go:334] "Generic (PLEG): container finished" podID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" containerID="3a2d6e5ffe7db56bc7dbd8e07419385f55f9b50e5496296af21e05436f870d95" exitCode=0 Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.719611 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-646bh" event={"ID":"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52","Type":"ContainerDied","Data":"3a2d6e5ffe7db56bc7dbd8e07419385f55f9b50e5496296af21e05436f870d95"} Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.723033 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerID="f108915e52d7dde700f8ca6ec043e4a50c3383cd60c414f632df9f8fab1b30ad" exitCode=0 Dec 03 14:27:24 crc kubenswrapper[5004]: I1203 14:27:24.723069 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" event={"ID":"63d10fd7-16ee-4670-8b57-b2cf118f7530","Type":"ContainerDied","Data":"f108915e52d7dde700f8ca6ec043e4a50c3383cd60c414f632df9f8fab1b30ad"} Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.144078 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-646bh" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.227195 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle\") pod \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.227284 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data\") pod \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.227338 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6xs\" (UniqueName: \"kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs\") pod \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\" (UID: \"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.246127 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" (UID: "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.251705 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs" (OuterVolumeSpecName: "kube-api-access-fg6xs") pod "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" (UID: "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52"). 
InnerVolumeSpecName "kube-api-access-fg6xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.297410 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" (UID: "c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.329057 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.329104 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.329114 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6xs\" (UniqueName: \"kubernetes.io/projected/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52-kube-api-access-fg6xs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.698630 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.753407 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-646bh" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.753517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-646bh" event={"ID":"c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52","Type":"ContainerDied","Data":"4406f065d05c6775ca615bf6c0c332e4274964e4b9d6c1ed60edede50c023bab"} Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.753557 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4406f065d05c6775ca615bf6c0c332e4274964e4b9d6c1ed60edede50c023bab" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.780351 5004 generic.go:334] "Generic (PLEG): container finished" podID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" containerID="e3762ab41b089bbb2b654cb032992671bc7c5239579143c08ef4e7476721e67f" exitCode=0 Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.780492 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wv94f" event={"ID":"82059d63-43a0-43ed-b9ea-9c54f700a2dc","Type":"ContainerDied","Data":"e3762ab41b089bbb2b654cb032992671bc7c5239579143c08ef4e7476721e67f"} Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.785698 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" event={"ID":"63d10fd7-16ee-4670-8b57-b2cf118f7530","Type":"ContainerDied","Data":"5f12717a7c89926a9e039d71bb60f754d3e7f541dc43527457e4ad1e0b1785a8"} Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.785758 5004 scope.go:117] "RemoveContainer" containerID="f108915e52d7dde700f8ca6ec043e4a50c3383cd60c414f632df9f8fab1b30ad" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.785819 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9rkv4" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.821939 5004 scope.go:117] "RemoveContainer" containerID="8c1a093e281e3599558b4236f645d67964df02fcf656556ed573088d2ea345cc" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.839636 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.840068 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd975\" (UniqueName: \"kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.840099 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.840333 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.840416 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: 
\"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.840459 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb\") pod \"63d10fd7-16ee-4670-8b57-b2cf118f7530\" (UID: \"63d10fd7-16ee-4670-8b57-b2cf118f7530\") " Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.844090 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975" (OuterVolumeSpecName: "kube-api-access-wd975") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "kube-api-access-wd975". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.896729 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.900320 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.902538 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.907215 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.910141 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config" (OuterVolumeSpecName: "config") pod "63d10fd7-16ee-4670-8b57-b2cf118f7530" (UID: "63d10fd7-16ee-4670-8b57-b2cf118f7530"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943422 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943461 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd975\" (UniqueName: \"kubernetes.io/projected/63d10fd7-16ee-4670-8b57-b2cf118f7530-kube-api-access-wd975\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943473 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943482 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943493 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:26 crc kubenswrapper[5004]: I1203 14:27:26.943504 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63d10fd7-16ee-4670-8b57-b2cf118f7530-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029097 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56979bf587-swwll"] Dec 03 14:27:27 crc kubenswrapper[5004]: E1203 14:27:27.029478 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" 
containerName="barbican-db-sync" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029495 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" containerName="barbican-db-sync" Dec 03 14:27:27 crc kubenswrapper[5004]: E1203 14:27:27.029508 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="dnsmasq-dns" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029515 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="dnsmasq-dns" Dec 03 14:27:27 crc kubenswrapper[5004]: E1203 14:27:27.029540 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="init" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029546 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="init" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029707 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" containerName="barbican-db-sync" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.029734 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" containerName="dnsmasq-dns" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.030676 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.041119 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.045375 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.072929 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c799f8fcd-gg559"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.074765 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.092277 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n8ncc" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.092580 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.101715 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56979bf587-swwll"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.134921 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c799f8fcd-gg559"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.148824 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data-custom\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.148912 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.148974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.148995 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-combined-ca-bundle\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149043 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data-custom\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149061 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrhd\" (UniqueName: \"kubernetes.io/projected/c4501cb8-8287-4e9d-83b2-858fcb7c431c-kube-api-access-6zrhd\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: 
\"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149086 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033d65ef-e917-445c-9c56-cffb8b328dbf-logs\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149115 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149144 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4501cb8-8287-4e9d-83b2-858fcb7c431c-logs\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.149165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5z7\" (UniqueName: \"kubernetes.io/projected/033d65ef-e917-445c-9c56-cffb8b328dbf-kube-api-access-kw5z7\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.253760 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data-custom\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254161 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254212 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-combined-ca-bundle\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data-custom\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254294 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6zrhd\" (UniqueName: \"kubernetes.io/projected/c4501cb8-8287-4e9d-83b2-858fcb7c431c-kube-api-access-6zrhd\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254314 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033d65ef-e917-445c-9c56-cffb8b328dbf-logs\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254342 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254372 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4501cb8-8287-4e9d-83b2-858fcb7c431c-logs\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.254396 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5z7\" (UniqueName: \"kubernetes.io/projected/033d65ef-e917-445c-9c56-cffb8b328dbf-kube-api-access-kw5z7\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.274960 5004 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.276437 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033d65ef-e917-445c-9c56-cffb8b328dbf-logs\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.276838 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.278608 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4501cb8-8287-4e9d-83b2-858fcb7c431c-logs\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.284513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.285897 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-combined-ca-bundle\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.286432 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data-custom\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.287356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-combined-ca-bundle\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.298624 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4501cb8-8287-4e9d-83b2-858fcb7c431c-config-data\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.312444 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.317582 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrhd\" (UniqueName: \"kubernetes.io/projected/c4501cb8-8287-4e9d-83b2-858fcb7c431c-kube-api-access-6zrhd\") pod \"barbican-keystone-listener-7c799f8fcd-gg559\" (UID: \"c4501cb8-8287-4e9d-83b2-858fcb7c431c\") " pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.324419 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/033d65ef-e917-445c-9c56-cffb8b328dbf-config-data-custom\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " 
pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.326487 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5z7\" (UniqueName: \"kubernetes.io/projected/033d65ef-e917-445c-9c56-cffb8b328dbf-kube-api-access-kw5z7\") pod \"barbican-worker-56979bf587-swwll\" (UID: \"033d65ef-e917-445c-9c56-cffb8b328dbf\") " pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359372 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359394 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgv4\" (UniqueName: \"kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359416 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: 
\"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359435 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.359455 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.365921 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.405917 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9rkv4"] Dec 03 14:27:27 crc kubenswrapper[5004]: E1203 14:27:27.430655 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.440924 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.442488 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.446578 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.449723 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.462898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.462965 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.462989 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgv4\" (UniqueName: \"kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.463017 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: 
I1203 14:27:27.463037 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.463056 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.464416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.464641 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.464918 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.465177 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.465470 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.467215 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56979bf587-swwll" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.484562 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgv4\" (UniqueName: \"kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4\") pod \"dnsmasq-dns-85ff748b95-fw4gz\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.500322 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.565984 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.566038 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.566096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.566134 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w4cj\" (UniqueName: \"kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.566214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle\") pod 
\"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.670207 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.670266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.673840 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.673942 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w4cj\" (UniqueName: \"kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.677304 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle\") pod \"barbican-api-578849f58d-fwczz\" (UID: 
\"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.677625 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.686146 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.701781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.702325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.712649 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.726240 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w4cj\" (UniqueName: 
\"kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj\") pod \"barbican-api-578849f58d-fwczz\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.735721 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d10fd7-16ee-4670-8b57-b2cf118f7530" path="/var/lib/kubelet/pods/63d10fd7-16ee-4670-8b57-b2cf118f7530/volumes" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.736571 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.769306 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.818025 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d"} Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.840652 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="ceilometer-notification-agent" containerID="cri-o://a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f" gracePeriod=30 Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.840929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerStarted","Data":"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051"} Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.840973 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.841201 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="proxy-httpd" containerID="cri-o://7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051" gracePeriod=30 Dec 03 14:27:27 crc kubenswrapper[5004]: I1203 14:27:27.841249 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="sg-core" containerID="cri-o://11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f" gracePeriod=30 Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.109013 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56979bf587-swwll"] Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.175473 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.310462 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wv94f" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.334880 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c799f8fcd-gg559"] Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398125 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398193 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgqh\" (UniqueName: \"kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398317 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398395 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: 
\"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398427 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts\") pod \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\" (UID: \"82059d63-43a0-43ed-b9ea-9c54f700a2dc\") " Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.398725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.401674 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82059d63-43a0-43ed-b9ea-9c54f700a2dc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.406317 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts" (OuterVolumeSpecName: "scripts") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.406545 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh" (OuterVolumeSpecName: "kube-api-access-zsgqh") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "kube-api-access-zsgqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.406718 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.409841 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79df97d86b-4dr9p" podUID="e0f1c734-5c6e-4f15-8f11-1e3c1da2d880" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.430831 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.449755 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data" (OuterVolumeSpecName: "config-data") pod "82059d63-43a0-43ed-b9ea-9c54f700a2dc" (UID: "82059d63-43a0-43ed-b9ea-9c54f700a2dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:28 crc kubenswrapper[5004]: W1203 14:27:28.497096 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70922bcf_bd0c_4eea_90e2_b49a02262d24.slice/crio-48268b34025eb0d90d694ead067b8d37fd1813177fe1e68bc435056a839c4381 WatchSource:0}: Error finding container 48268b34025eb0d90d694ead067b8d37fd1813177fe1e68bc435056a839c4381: Status 404 returned error can't find the container with id 48268b34025eb0d90d694ead067b8d37fd1813177fe1e68bc435056a839c4381 Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503153 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503195 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503243 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsgqh\" (UniqueName: \"kubernetes.io/projected/82059d63-43a0-43ed-b9ea-9c54f700a2dc-kube-api-access-zsgqh\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503261 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503272 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82059d63-43a0-43ed-b9ea-9c54f700a2dc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.503154 
5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.510562 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.856306 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56979bf587-swwll" event={"ID":"033d65ef-e917-445c-9c56-cffb8b328dbf","Type":"ContainerStarted","Data":"b28e4af291860651d85125404634c31f8cffafdb75ec7aa9973a3440b07a0e92"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.858787 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wv94f" event={"ID":"82059d63-43a0-43ed-b9ea-9c54f700a2dc","Type":"ContainerDied","Data":"dfe60405dbb7ac61700ff0b03b8f19de13b7a02817cfb1d04e573f0ce65c041a"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.858816 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfe60405dbb7ac61700ff0b03b8f19de13b7a02817cfb1d04e573f0ce65c041a" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.858891 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wv94f" Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.871903 5004 generic.go:334] "Generic (PLEG): container finished" podID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerID="65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8" exitCode=0 Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.872136 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" event={"ID":"70922bcf-bd0c-4eea-90e2-b49a02262d24","Type":"ContainerDied","Data":"65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.872256 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" event={"ID":"70922bcf-bd0c-4eea-90e2-b49a02262d24","Type":"ContainerStarted","Data":"48268b34025eb0d90d694ead067b8d37fd1813177fe1e68bc435056a839c4381"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.897315 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerStarted","Data":"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.897793 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerStarted","Data":"40aac21c881dad59f0cfd27893741f48944a16d4bf0008701c4efe11e6dcbc82"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.911705 5004 generic.go:334] "Generic (PLEG): container finished" podID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerID="7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051" exitCode=0 Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.911740 5004 generic.go:334] "Generic (PLEG): container finished" podID="a3dd33fc-70e6-4c71-903a-1337fa225e82" 
containerID="11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f" exitCode=2 Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.911796 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerDied","Data":"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.911823 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerDied","Data":"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f"} Dec 03 14:27:28 crc kubenswrapper[5004]: I1203 14:27:28.918653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" event={"ID":"c4501cb8-8287-4e9d-83b2-858fcb7c431c","Type":"ContainerStarted","Data":"2160dfc14f25d43ed1da56c95a1758f1e94c86245de99ca5def6923f6703c987"} Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.190723 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:29 crc kubenswrapper[5004]: E1203 14:27:29.192622 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" containerName="cinder-db-sync" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.192652 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" containerName="cinder-db-sync" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.192850 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" containerName="cinder-db-sync" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.194150 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.221442 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.221552 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.221735 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kzb2h" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.221806 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.234468 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.240394 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.273966 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.275554 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.315727 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324466 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324586 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj2q9\" (UniqueName: \"kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324614 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324632 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.324649 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.458090 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2q9\" (UniqueName: \"kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.458686 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.458727 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.458775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.458803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459030 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459064 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459092 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459138 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459235 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fz8r\" (UniqueName: \"kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459287 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.459416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.461325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.523010 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc 
kubenswrapper[5004]: I1203 14:27:29.532764 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.533535 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2q9\" (UniqueName: \"kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.547967 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.550513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.553929 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.559541 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562667 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 
14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.562778 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fz8r\" (UniqueName: \"kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.564045 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.564603 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.565137 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.565660 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.566186 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.570412 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.577216 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.608339 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.619050 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fz8r\" (UniqueName: \"kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r\") pod \"dnsmasq-dns-5c9776ccc5-z9mhl\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.633971 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664320 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwqg\" (UniqueName: \"kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664385 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664484 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664526 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664545 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 
14:27:29.664711 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.664780 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767074 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767554 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767642 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767729 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767838 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwqg\" (UniqueName: \"kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.767960 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.770395 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.773070 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.776478 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.778315 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.790157 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.790390 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.806049 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwqg\" (UniqueName: \"kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg\") pod \"cinder-api-0\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " pod="openstack/cinder-api-0" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.936283 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" event={"ID":"70922bcf-bd0c-4eea-90e2-b49a02262d24","Type":"ContainerStarted","Data":"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263"} Dec 03 
14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.936445 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="dnsmasq-dns" containerID="cri-o://1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263" gracePeriod=10 Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.936485 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.942280 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerStarted","Data":"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e"} Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.942505 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.969050 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" podStartSLOduration=2.968996639 podStartE2EDuration="2.968996639s" podCreationTimestamp="2025-12-03 14:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:29.95960227 +0000 UTC m=+1262.708572506" watchObservedRunningTime="2025-12-03 14:27:29.968996639 +0000 UTC m=+1262.717966875" Dec 03 14:27:29 crc kubenswrapper[5004]: I1203 14:27:29.985478 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-578849f58d-fwczz" podStartSLOduration=2.985449699 podStartE2EDuration="2.985449699s" podCreationTimestamp="2025-12-03 14:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 14:27:29.98023811 +0000 UTC m=+1262.729208376" watchObservedRunningTime="2025-12-03 14:27:29.985449699 +0000 UTC m=+1262.734419935" Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.038197 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.130655 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.248746 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:27:30 crc kubenswrapper[5004]: W1203 14:27:30.371023 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod690e6780_2a59_47d5_8485_6ca1f13cb0de.slice/crio-a8d30f44c9d753ab5d4cb4c0050f70f158490350cefb321244ee49c24eee4e6a WatchSource:0}: Error finding container a8d30f44c9d753ab5d4cb4c0050f70f158490350cefb321244ee49c24eee4e6a: Status 404 returned error can't find the container with id a8d30f44c9d753ab5d4cb4c0050f70f158490350cefb321244ee49c24eee4e6a Dec 03 14:27:30 crc kubenswrapper[5004]: W1203 14:27:30.374099 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d27d2d8_33ae_4e15_bd5f_ab671d40bfc0.slice/crio-ee17711545a86ae1bdc5ea53dd770b67914850ab9618085b8d8d6913d5018ab5 WatchSource:0}: Error finding container ee17711545a86ae1bdc5ea53dd770b67914850ab9618085b8d8d6913d5018ab5: Status 404 returned error can't find the container with id ee17711545a86ae1bdc5ea53dd770b67914850ab9618085b8d8d6913d5018ab5 Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.842523 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.981758 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerStarted","Data":"a8d30f44c9d753ab5d4cb4c0050f70f158490350cefb321244ee49c24eee4e6a"} Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.985482 5004 generic.go:334] "Generic (PLEG): container finished" podID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerID="1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263" exitCode=0 Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.985549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" event={"ID":"70922bcf-bd0c-4eea-90e2-b49a02262d24","Type":"ContainerDied","Data":"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263"} Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.985622 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" event={"ID":"70922bcf-bd0c-4eea-90e2-b49a02262d24","Type":"ContainerDied","Data":"48268b34025eb0d90d694ead067b8d37fd1813177fe1e68bc435056a839c4381"} Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.985641 5004 scope.go:117] "RemoveContainer" containerID="1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263" Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.985624 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fw4gz" Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.989965 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" event={"ID":"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0","Type":"ContainerStarted","Data":"ee17711545a86ae1bdc5ea53dd770b67914850ab9618085b8d8d6913d5018ab5"} Dec 03 14:27:30 crc kubenswrapper[5004]: I1203 14:27:30.991364 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.005604 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.005761 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.006037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzgv4\" (UniqueName: \"kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.006066 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 
14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.006118 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.006151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb\") pod \"70922bcf-bd0c-4eea-90e2-b49a02262d24\" (UID: \"70922bcf-bd0c-4eea-90e2-b49a02262d24\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.017109 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4" (OuterVolumeSpecName: "kube-api-access-wzgv4") pod "70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "kube-api-access-wzgv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.081160 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.108663 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.108843 5004 scope.go:117] "RemoveContainer" containerID="65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.109070 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config" (OuterVolumeSpecName: "config") pod "70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.111291 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.111323 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzgv4\" (UniqueName: \"kubernetes.io/projected/70922bcf-bd0c-4eea-90e2-b49a02262d24-kube-api-access-wzgv4\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.111340 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.112520 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.128666 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.130065 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70922bcf-bd0c-4eea-90e2-b49a02262d24" (UID: "70922bcf-bd0c-4eea-90e2-b49a02262d24"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.214689 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.214724 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70922bcf-bd0c-4eea-90e2-b49a02262d24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.287266 5004 scope.go:117] "RemoveContainer" containerID="1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263" Dec 03 14:27:31 crc kubenswrapper[5004]: E1203 14:27:31.287740 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263\": container with ID starting with 1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263 not found: ID does not exist" containerID="1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.287776 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263"} err="failed to get container status \"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263\": rpc error: code = NotFound desc = could not find container \"1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263\": container with ID starting with 1e0a7346694638f530fbaeed6d3f93c73a5206a0ff3446a80cd35047a8bba263 not found: ID does not exist" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.287802 5004 scope.go:117] "RemoveContainer" containerID="65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8" Dec 03 14:27:31 crc kubenswrapper[5004]: E1203 14:27:31.288048 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8\": container with ID starting with 65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8 not found: ID does not exist" containerID="65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.288071 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8"} err="failed to get container status \"65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8\": rpc error: code = NotFound desc = could not find container \"65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8\": container with ID starting with 65c9cd9fe54a54955c80ec09f256040a53732389e5ec3cbfea01c692c150bfe8 not found: ID does not exist" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.393041 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.401348 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-fw4gz"] Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.497472 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.565554 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.626649 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" path="/var/lib/kubelet/pods/70922bcf-bd0c-4eea-90e2-b49a02262d24/volumes" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.725959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726053 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726388 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726491 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: 
\"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726551 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726597 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.726631 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kvw2\" (UniqueName: \"kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2\") pod \"a3dd33fc-70e6-4c71-903a-1337fa225e82\" (UID: \"a3dd33fc-70e6-4c71-903a-1337fa225e82\") " Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.728298 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.728739 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.733179 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts" (OuterVolumeSpecName: "scripts") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.733826 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2" (OuterVolumeSpecName: "kube-api-access-5kvw2") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "kube-api-access-5kvw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.759030 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.808189 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.823599 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data" (OuterVolumeSpecName: "config-data") pod "a3dd33fc-70e6-4c71-903a-1337fa225e82" (UID: "a3dd33fc-70e6-4c71-903a-1337fa225e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829458 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829505 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829518 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kvw2\" (UniqueName: \"kubernetes.io/projected/a3dd33fc-70e6-4c71-903a-1337fa225e82-kube-api-access-5kvw2\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829533 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829546 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829559 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a3dd33fc-70e6-4c71-903a-1337fa225e82-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:31 crc kubenswrapper[5004]: I1203 14:27:31.829572 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dd33fc-70e6-4c71-903a-1337fa225e82-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.016092 5004 generic.go:334] "Generic (PLEG): container finished" podID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerID="a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f" exitCode=0 Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.016132 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerDied","Data":"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.016480 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3dd33fc-70e6-4c71-903a-1337fa225e82","Type":"ContainerDied","Data":"a7ba8666bbc439e42afbdbca1664998d592526e558e6afdfe65e813a67817932"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.016196 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.016501 5004 scope.go:117] "RemoveContainer" containerID="7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.019382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" event={"ID":"c4501cb8-8287-4e9d-83b2-858fcb7c431c","Type":"ContainerStarted","Data":"b857be55c236a673adc8bebe613d8ccc2d7adf9d0069096349a873deba70950e"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.026817 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerStarted","Data":"9c25ef0237577f4aeda03a7c86df913865005cdff9a8fca0acabae2b7919feb6"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.029370 5004 generic.go:334] "Generic (PLEG): container finished" podID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerID="fcdb4e59475e84328c24a47140d1c6de9d342714466115dee35d85a302fd3eab" exitCode=0 Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.029418 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" event={"ID":"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0","Type":"ContainerDied","Data":"fcdb4e59475e84328c24a47140d1c6de9d342714466115dee35d85a302fd3eab"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.034470 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56979bf587-swwll" event={"ID":"033d65ef-e917-445c-9c56-cffb8b328dbf","Type":"ContainerStarted","Data":"d54408dfff78f033f860a78d7cb87f55549f0b6ad9678fd1ddd7e3ce9cd333bc"} Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.060279 5004 scope.go:117] "RemoveContainer" containerID="11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.170197 5004 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.191483 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.208318 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.208974 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="init" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.209079 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="init" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.209176 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="dnsmasq-dns" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.209768 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="dnsmasq-dns" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.209869 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="ceilometer-notification-agent" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.209926 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="ceilometer-notification-agent" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.210021 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="proxy-httpd" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210078 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="proxy-httpd" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 
14:27:32.210166 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="sg-core" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210230 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="sg-core" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210510 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="sg-core" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210584 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="70922bcf-bd0c-4eea-90e2-b49a02262d24" containerName="dnsmasq-dns" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210665 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="proxy-httpd" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.210744 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" containerName="ceilometer-notification-agent" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.212927 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.219513 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.221160 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.231936 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.248003 5004 scope.go:117] "RemoveContainer" containerID="a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.326164 5004 scope.go:117] "RemoveContainer" containerID="7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.327076 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051\": container with ID starting with 7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051 not found: ID does not exist" containerID="7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.327123 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051"} err="failed to get container status \"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051\": rpc error: code = NotFound desc = could not find container \"7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051\": container with ID starting with 7fd6d20ec1579f8942a0336d60a31a7684a198f4a69864a871a456c851f2a051 not found: ID does not exist" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 
14:27:32.327152 5004 scope.go:117] "RemoveContainer" containerID="11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.327487 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f\": container with ID starting with 11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f not found: ID does not exist" containerID="11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.327505 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f"} err="failed to get container status \"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f\": rpc error: code = NotFound desc = could not find container \"11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f\": container with ID starting with 11161d1f53557d05951b08b7290997964f7c8cff80301406413613cda8e8e10f not found: ID does not exist" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.327517 5004 scope.go:117] "RemoveContainer" containerID="a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f" Dec 03 14:27:32 crc kubenswrapper[5004]: E1203 14:27:32.327718 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f\": container with ID starting with a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f not found: ID does not exist" containerID="a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.327733 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f"} err="failed to get container status \"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f\": rpc error: code = NotFound desc = could not find container \"a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f\": container with ID starting with a8c98ff167c5afe3f48015b56e9b1902bffe6e3b776ece447b4e04a90c62b55f not found: ID does not exist" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.350717 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.350779 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mvj\" (UniqueName: \"kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.350884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.350914 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc 
kubenswrapper[5004]: I1203 14:27:32.350991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.351016 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.351067 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.441263 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452555 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 
14:27:32.452620 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452640 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452681 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452729 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.452756 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mvj\" (UniqueName: \"kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.453215 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.454139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.457721 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.458762 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.459735 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.461647 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.488057 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mvj\" (UniqueName: 
\"kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj\") pod \"ceilometer-0\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " pod="openstack/ceilometer-0" Dec 03 14:27:32 crc kubenswrapper[5004]: I1203 14:27:32.537588 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.057996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" event={"ID":"c4501cb8-8287-4e9d-83b2-858fcb7c431c","Type":"ContainerStarted","Data":"be6a0cfdbeef55a0f3d8510d3904313ce9487eb6a577b7aace6b47d02df22054"} Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.070050 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerStarted","Data":"6d53e5e3c44e90555a85b91a5271d9a90c1c05995acaa13cf2f1c3fca4936cca"} Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.076293 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" event={"ID":"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0","Type":"ContainerStarted","Data":"79694b788d43d70467df161240babad1bc3346b1861cae5356991f5700a97b91"} Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.077284 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.084222 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56979bf587-swwll" event={"ID":"033d65ef-e917-445c-9c56-cffb8b328dbf","Type":"ContainerStarted","Data":"3def212d8ec355bb39506838d71b3ce5bb354f7f21c5811b98324a0cbca54e34"} Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.097125 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c799f8fcd-gg559" 
podStartSLOduration=4.488797448 podStartE2EDuration="7.097103677s" podCreationTimestamp="2025-12-03 14:27:26 +0000 UTC" firstStartedPulling="2025-12-03 14:27:28.347683157 +0000 UTC m=+1261.096653393" lastFinishedPulling="2025-12-03 14:27:30.955989386 +0000 UTC m=+1263.704959622" observedRunningTime="2025-12-03 14:27:33.088917273 +0000 UTC m=+1265.837887509" watchObservedRunningTime="2025-12-03 14:27:33.097103677 +0000 UTC m=+1265.846073913" Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.122391 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" podStartSLOduration=4.122367709 podStartE2EDuration="4.122367709s" podCreationTimestamp="2025-12-03 14:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:33.11748926 +0000 UTC m=+1265.866459496" watchObservedRunningTime="2025-12-03 14:27:33.122367709 +0000 UTC m=+1265.871337945" Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.133062 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.152659 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56979bf587-swwll" podStartSLOduration=4.303075869 podStartE2EDuration="7.152635954s" podCreationTimestamp="2025-12-03 14:27:26 +0000 UTC" firstStartedPulling="2025-12-03 14:27:28.120200684 +0000 UTC m=+1260.869170920" lastFinishedPulling="2025-12-03 14:27:30.969760769 +0000 UTC m=+1263.718731005" observedRunningTime="2025-12-03 14:27:33.143304448 +0000 UTC m=+1265.892274684" watchObservedRunningTime="2025-12-03 14:27:33.152635954 +0000 UTC m=+1265.901606190" Dec 03 14:27:33 crc kubenswrapper[5004]: I1203 14:27:33.634757 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dd33fc-70e6-4c71-903a-1337fa225e82" 
path="/var/lib/kubelet/pods/a3dd33fc-70e6-4c71-903a-1337fa225e82/volumes" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.017742 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f76744786-jfgf7"] Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.020470 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.026743 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.026894 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.038395 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f76744786-jfgf7"] Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097286 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data-custom\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097384 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-internal-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097425 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-public-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097483 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f8cc\" (UniqueName: \"kubernetes.io/projected/939c0a06-65e2-45ea-b58d-7d4cc431207b-kube-api-access-6f8cc\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939c0a06-65e2-45ea-b58d-7d4cc431207b-logs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097679 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.097730 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-combined-ca-bundle\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.106451 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerStarted","Data":"c07040fe4f2ab205567662214e5cd2b9cbf001a7c57a0ff3781d638b3202230b"} Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.106534 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerStarted","Data":"3ca56be3dc9d414ce84ec0c2da3b9deae196eb6953274554d1d4e8425fbb4933"} Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.112300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerStarted","Data":"c0ae94b2f5a01a864e7882903696e9bda1debfed67efcbf3809c711c43daa91c"} Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.112385 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.112396 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api-log" containerID="cri-o://6d53e5e3c44e90555a85b91a5271d9a90c1c05995acaa13cf2f1c3fca4936cca" gracePeriod=30 Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.112449 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api" containerID="cri-o://c0ae94b2f5a01a864e7882903696e9bda1debfed67efcbf3809c711c43daa91c" gracePeriod=30 Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.127300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerStarted","Data":"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c"} Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.139626 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.139605161 podStartE2EDuration="5.139605161s" podCreationTimestamp="2025-12-03 14:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:34.138931601 +0000 UTC m=+1266.887901837" watchObservedRunningTime="2025-12-03 14:27:34.139605161 +0000 UTC m=+1266.888575397" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.198968 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199067 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-combined-ca-bundle\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data-custom\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199211 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-internal-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: 
\"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199248 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-public-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199271 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f8cc\" (UniqueName: \"kubernetes.io/projected/939c0a06-65e2-45ea-b58d-7d4cc431207b-kube-api-access-6f8cc\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.199299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939c0a06-65e2-45ea-b58d-7d4cc431207b-logs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.279747 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939c0a06-65e2-45ea-b58d-7d4cc431207b-logs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.282937 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-public-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " 
pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.283576 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data-custom\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.283721 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-combined-ca-bundle\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.285476 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-internal-tls-certs\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.286611 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/939c0a06-65e2-45ea-b58d-7d4cc431207b-config-data\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 crc kubenswrapper[5004]: I1203 14:27:34.300519 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f8cc\" (UniqueName: \"kubernetes.io/projected/939c0a06-65e2-45ea-b58d-7d4cc431207b-kube-api-access-6f8cc\") pod \"barbican-api-5f76744786-jfgf7\" (UID: \"939c0a06-65e2-45ea-b58d-7d4cc431207b\") " pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:34 
crc kubenswrapper[5004]: I1203 14:27:34.347355 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:35 crc kubenswrapper[5004]: W1203 14:27:35.008015 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod939c0a06_65e2_45ea_b58d_7d4cc431207b.slice/crio-a8db4bd32ab12b8cd1c2ffa3beb7754fa98f88bd30d91e8243c4b55a641e0a81 WatchSource:0}: Error finding container a8db4bd32ab12b8cd1c2ffa3beb7754fa98f88bd30d91e8243c4b55a641e0a81: Status 404 returned error can't find the container with id a8db4bd32ab12b8cd1c2ffa3beb7754fa98f88bd30d91e8243c4b55a641e0a81 Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.026279 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f76744786-jfgf7"] Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.170322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerStarted","Data":"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.200277 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerStarted","Data":"cd2dcac8db020559ba23390885c37b2a2509d530d6deff69030578500225185c"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220289 5004 generic.go:334] "Generic (PLEG): container finished" podID="ae8060b4-671e-42a0-a603-3412500ddd72" containerID="c0ae94b2f5a01a864e7882903696e9bda1debfed67efcbf3809c711c43daa91c" exitCode=0 Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220321 5004 generic.go:334] "Generic (PLEG): container finished" podID="ae8060b4-671e-42a0-a603-3412500ddd72" containerID="6d53e5e3c44e90555a85b91a5271d9a90c1c05995acaa13cf2f1c3fca4936cca" exitCode=143 Dec 03 
14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220386 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerDied","Data":"c0ae94b2f5a01a864e7882903696e9bda1debfed67efcbf3809c711c43daa91c"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220421 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerDied","Data":"6d53e5e3c44e90555a85b91a5271d9a90c1c05995acaa13cf2f1c3fca4936cca"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220435 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ae8060b4-671e-42a0-a603-3412500ddd72","Type":"ContainerDied","Data":"9c25ef0237577f4aeda03a7c86df913865005cdff9a8fca0acabae2b7919feb6"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.220447 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c25ef0237577f4aeda03a7c86df913865005cdff9a8fca0acabae2b7919feb6" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.223161 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f76744786-jfgf7" event={"ID":"939c0a06-65e2-45ea-b58d-7d4cc431207b","Type":"ContainerStarted","Data":"a8db4bd32ab12b8cd1c2ffa3beb7754fa98f88bd30d91e8243c4b55a641e0a81"} Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.232944 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.255346 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.825705397 podStartE2EDuration="6.255325857s" podCreationTimestamp="2025-12-03 14:27:29 +0000 UTC" firstStartedPulling="2025-12-03 14:27:30.386303579 +0000 UTC m=+1263.135273815" lastFinishedPulling="2025-12-03 14:27:31.815924039 +0000 UTC m=+1264.564894275" observedRunningTime="2025-12-03 14:27:35.200525291 +0000 UTC m=+1267.949495527" watchObservedRunningTime="2025-12-03 14:27:35.255325857 +0000 UTC m=+1268.004296093" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.334521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335002 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mwqg\" (UniqueName: \"kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335053 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data\") pod 
\"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335080 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335101 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335123 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.335272 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs\") pod \"ae8060b4-671e-42a0-a603-3412500ddd72\" (UID: \"ae8060b4-671e-42a0-a603-3412500ddd72\") " Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.336107 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae8060b4-671e-42a0-a603-3412500ddd72-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.336780 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs" (OuterVolumeSpecName: "logs") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.354520 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg" (OuterVolumeSpecName: "kube-api-access-2mwqg") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "kube-api-access-2mwqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.354621 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.390351 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts" (OuterVolumeSpecName: "scripts") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.438165 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mwqg\" (UniqueName: \"kubernetes.io/projected/ae8060b4-671e-42a0-a603-3412500ddd72-kube-api-access-2mwqg\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.438216 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.438229 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.438238 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae8060b4-671e-42a0-a603-3412500ddd72-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.442162 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.446005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data" (OuterVolumeSpecName: "config-data") pod "ae8060b4-671e-42a0-a603-3412500ddd72" (UID: "ae8060b4-671e-42a0-a603-3412500ddd72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.539678 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:35 crc kubenswrapper[5004]: I1203 14:27:35.539710 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8060b4-671e-42a0-a603-3412500ddd72-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.234503 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerStarted","Data":"00044496e8531ccbd44a5638ff8e6d16a0ece6e5fdb86717ee4e2426f2e4c720"} Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.238904 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.238971 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f76744786-jfgf7" event={"ID":"939c0a06-65e2-45ea-b58d-7d4cc431207b","Type":"ContainerStarted","Data":"d92bd7da3ad2fb9c8f3b7bdc273f515c34a6b768a0175822d2b3144b39ce63b9"} Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.239054 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f76744786-jfgf7" event={"ID":"939c0a06-65e2-45ea-b58d-7d4cc431207b","Type":"ContainerStarted","Data":"e9f815b111fe49284e3375627679e301cfd2268a82f7d8e9c94928abb93a7c73"} Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.239467 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.283169 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f76744786-jfgf7" podStartSLOduration=3.283144931 podStartE2EDuration="3.283144931s" podCreationTimestamp="2025-12-03 14:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:36.27367243 +0000 UTC m=+1269.022642666" watchObservedRunningTime="2025-12-03 14:27:36.283144931 +0000 UTC m=+1269.032115187" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.292121 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.309619 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.331196 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:36 crc kubenswrapper[5004]: E1203 14:27:36.331686 5004 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.331706 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api" Dec 03 14:27:36 crc kubenswrapper[5004]: E1203 14:27:36.331757 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api-log" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.331770 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api-log" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.332087 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api-log" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.332120 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" containerName="cinder-api" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.333367 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.336599 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.338614 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.338790 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.341995 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455219 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455307 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-scripts\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data-custom\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455440 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e715befa-4ae4-4466-beb4-ee8939e3bb86-logs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455624 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e715befa-4ae4-4466-beb4-ee8939e3bb86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455678 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxbk\" (UniqueName: \"kubernetes.io/projected/e715befa-4ae4-4466-beb4-ee8939e3bb86-kube-api-access-7xxbk\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455722 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.455751 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data-custom\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558120 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558171 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e715befa-4ae4-4466-beb4-ee8939e3bb86-logs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558238 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e715befa-4ae4-4466-beb4-ee8939e3bb86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxbk\" (UniqueName: \"kubernetes.io/projected/e715befa-4ae4-4466-beb4-ee8939e3bb86-kube-api-access-7xxbk\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558293 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.558390 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-scripts\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.559442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e715befa-4ae4-4466-beb4-ee8939e3bb86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.559815 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e715befa-4ae4-4466-beb4-ee8939e3bb86-logs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 
14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.564752 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.564851 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.564926 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.565603 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data-custom\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.575887 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-config-data\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.596163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e715befa-4ae4-4466-beb4-ee8939e3bb86-scripts\") pod \"cinder-api-0\" 
(UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.596479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxbk\" (UniqueName: \"kubernetes.io/projected/e715befa-4ae4-4466-beb4-ee8939e3bb86-kube-api-access-7xxbk\") pod \"cinder-api-0\" (UID: \"e715befa-4ae4-4466-beb4-ee8939e3bb86\") " pod="openstack/cinder-api-0" Dec 03 14:27:36 crc kubenswrapper[5004]: I1203 14:27:36.654179 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 14:27:37 crc kubenswrapper[5004]: I1203 14:27:37.123097 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 14:27:37 crc kubenswrapper[5004]: I1203 14:27:37.254563 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e715befa-4ae4-4466-beb4-ee8939e3bb86","Type":"ContainerStarted","Data":"43abac6e35e55d24ee767fa333b81596aa869b8ab936b5e5a44f946be21ddb09"} Dec 03 14:27:37 crc kubenswrapper[5004]: I1203 14:27:37.254618 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:37 crc kubenswrapper[5004]: I1203 14:27:37.641707 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8060b4-671e-42a0-a603-3412500ddd72" path="/var/lib/kubelet/pods/ae8060b4-671e-42a0-a603-3412500ddd72/volumes" Dec 03 14:27:38 crc kubenswrapper[5004]: I1203 14:27:38.277713 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e715befa-4ae4-4466-beb4-ee8939e3bb86","Type":"ContainerStarted","Data":"4eaebcd2054908d9bbee70184839d3da78c6d810f83757af2884e6abde021f56"} Dec 03 14:27:38 crc kubenswrapper[5004]: I1203 14:27:38.295441 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerStarted","Data":"09832fabe3456743ce5a7f91860612d763c6153eefaf4ddd5c7bbca6bacf2d16"} Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.104298 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.125422 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.263259064 podStartE2EDuration="7.125397827s" podCreationTimestamp="2025-12-03 14:27:32 +0000 UTC" firstStartedPulling="2025-12-03 14:27:33.18430031 +0000 UTC m=+1265.933270536" lastFinishedPulling="2025-12-03 14:27:37.046439063 +0000 UTC m=+1269.795409299" observedRunningTime="2025-12-03 14:27:38.320982611 +0000 UTC m=+1271.069952867" watchObservedRunningTime="2025-12-03 14:27:39.125397827 +0000 UTC m=+1271.874368063" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.312501 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e715befa-4ae4-4466-beb4-ee8939e3bb86","Type":"ContainerStarted","Data":"768778ee7cb711cfa73f0604a68bffc2f685b02e32c9890973694786beb8d2f4"} Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.312707 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.314120 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.482957 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.503943 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.503924179 podStartE2EDuration="3.503924179s" 
podCreationTimestamp="2025-12-03 14:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:39.346192519 +0000 UTC m=+1272.095162765" watchObservedRunningTime="2025-12-03 14:27:39.503924179 +0000 UTC m=+1272.252894415" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.560492 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.587773 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.666194 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.760074 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.760311 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerName="dnsmasq-dns" containerID="cri-o://5d97190441b12e18ba6b219a7b4fcf6e8f6e174ece11a627f7a20ce5b56993a0" gracePeriod=10 Dec 03 14:27:39 crc kubenswrapper[5004]: I1203 14:27:39.859705 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.323706 5004 generic.go:334] "Generic (PLEG): container finished" podID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerID="5d97190441b12e18ba6b219a7b4fcf6e8f6e174ece11a627f7a20ce5b56993a0" exitCode=0 Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.323914 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" 
event={"ID":"928b00e0-b667-4430-9bd3-2423bf037d6e","Type":"ContainerDied","Data":"5d97190441b12e18ba6b219a7b4fcf6e8f6e174ece11a627f7a20ce5b56993a0"} Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.324150 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" event={"ID":"928b00e0-b667-4430-9bd3-2423bf037d6e","Type":"ContainerDied","Data":"e0a79a2748bcd02709c867e84e56ce0aabcd35d18c2acac4fcfd9be05d769e8a"} Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.324170 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a79a2748bcd02709c867e84e56ce0aabcd35d18c2acac4fcfd9be05d769e8a" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.371605 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.407523 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515309 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fts6k\" (UniqueName: \"kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k\") pod \"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515386 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc\") pod \"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515469 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config\") pod 
\"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515500 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb\") pod \"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515540 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0\") pod \"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.515597 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb\") pod \"928b00e0-b667-4430-9bd3-2423bf037d6e\" (UID: \"928b00e0-b667-4430-9bd3-2423bf037d6e\") " Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.545311 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k" (OuterVolumeSpecName: "kube-api-access-fts6k") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "kube-api-access-fts6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.592794 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.621419 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.621451 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fts6k\" (UniqueName: \"kubernetes.io/projected/928b00e0-b667-4430-9bd3-2423bf037d6e-kube-api-access-fts6k\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.622602 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.630327 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config" (OuterVolumeSpecName: "config") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.634273 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.645982 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "928b00e0-b667-4430-9bd3-2423bf037d6e" (UID: "928b00e0-b667-4430-9bd3-2423bf037d6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.724158 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.724206 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.724218 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.724232 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/928b00e0-b667-4430-9bd3-2423bf037d6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.891490 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:27:40 crc kubenswrapper[5004]: I1203 14:27:40.947435 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.375899 
5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="cinder-scheduler" containerID="cri-o://ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c" gracePeriod=30 Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.376160 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="probe" containerID="cri-o://f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12" gracePeriod=30 Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.375971 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-89jhk" Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.407021 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66456bfc4f-v6lrf" Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.504248 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.539462 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-89jhk"] Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.548230 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.548683 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67457f9876-l5kn6" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-api" containerID="cri-o://dd591f0c931b6816b155aec5b3c908f746848a520186c68192a606ea930fb2ac" gracePeriod=30 Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.548845 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-67457f9876-l5kn6" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-httpd" containerID="cri-o://9df7744ac7308982f5849dadf0c0067607f2893d9d53424d70637e024304f2cb" gracePeriod=30 Dec 03 14:27:41 crc kubenswrapper[5004]: I1203 14:27:41.643807 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" path="/var/lib/kubelet/pods/928b00e0-b667-4430-9bd3-2423bf037d6e/volumes" Dec 03 14:27:42 crc kubenswrapper[5004]: I1203 14:27:42.383562 5004 generic.go:334] "Generic (PLEG): container finished" podID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerID="9df7744ac7308982f5849dadf0c0067607f2893d9d53424d70637e024304f2cb" exitCode=0 Dec 03 14:27:42 crc kubenswrapper[5004]: I1203 14:27:42.383621 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerDied","Data":"9df7744ac7308982f5849dadf0c0067607f2893d9d53424d70637e024304f2cb"} Dec 03 14:27:42 crc kubenswrapper[5004]: I1203 14:27:42.589707 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.342701 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-ccd4cc976-4jrqc" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.421751 5004 generic.go:334] "Generic (PLEG): container finished" podID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerID="dd591f0c931b6816b155aec5b3c908f746848a520186c68192a606ea930fb2ac" exitCode=0 Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.421829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerDied","Data":"dd591f0c931b6816b155aec5b3c908f746848a520186c68192a606ea930fb2ac"} Dec 03 14:27:43 crc kubenswrapper[5004]: 
I1203 14:27:43.421898 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67457f9876-l5kn6" event={"ID":"4d0bed93-69af-4c49-9d33-e6b847a06885","Type":"ContainerDied","Data":"b24ad8325c3b9ffbd2d5b5b2a5eac916da8124e900ce18e3c67a59edfcff8578"} Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.421912 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24ad8325c3b9ffbd2d5b5b2a5eac916da8124e900ce18e3c67a59edfcff8578" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.450310 5004 generic.go:334] "Generic (PLEG): container finished" podID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerID="f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12" exitCode=0 Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.451508 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerDied","Data":"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12"} Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.510953 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.602322 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config\") pod \"4d0bed93-69af-4c49-9d33-e6b847a06885\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.602402 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config\") pod \"4d0bed93-69af-4c49-9d33-e6b847a06885\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.602463 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle\") pod \"4d0bed93-69af-4c49-9d33-e6b847a06885\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.610017 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4d0bed93-69af-4c49-9d33-e6b847a06885" (UID: "4d0bed93-69af-4c49-9d33-e6b847a06885"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.685085 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config" (OuterVolumeSpecName: "config") pod "4d0bed93-69af-4c49-9d33-e6b847a06885" (UID: "4d0bed93-69af-4c49-9d33-e6b847a06885"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.689018 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d0bed93-69af-4c49-9d33-e6b847a06885" (UID: "4d0bed93-69af-4c49-9d33-e6b847a06885"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.704317 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxz9\" (UniqueName: \"kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9\") pod \"4d0bed93-69af-4c49-9d33-e6b847a06885\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.704370 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs\") pod \"4d0bed93-69af-4c49-9d33-e6b847a06885\" (UID: \"4d0bed93-69af-4c49-9d33-e6b847a06885\") " Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.704814 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.704840 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.704854 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.709135 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9" (OuterVolumeSpecName: "kube-api-access-rsxz9") pod "4d0bed93-69af-4c49-9d33-e6b847a06885" (UID: "4d0bed93-69af-4c49-9d33-e6b847a06885"). InnerVolumeSpecName "kube-api-access-rsxz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.803990 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4d0bed93-69af-4c49-9d33-e6b847a06885" (UID: "4d0bed93-69af-4c49-9d33-e6b847a06885"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.806835 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsxz9\" (UniqueName: \"kubernetes.io/projected/4d0bed93-69af-4c49-9d33-e6b847a06885-kube-api-access-rsxz9\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.806880 5004 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0bed93-69af-4c49-9d33-e6b847a06885-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:43 crc kubenswrapper[5004]: I1203 14:27:43.951462 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79df97d86b-4dr9p" Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.030188 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587bd47d68-c6stc"] Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.030435 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587bd47d68-c6stc" 
podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon-log" containerID="cri-o://170abb84be82d97c90bb1e36346b7122190618e65add8272765ea09c84996d53" gracePeriod=30 Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.030824 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" containerID="cri-o://85251560e3d53350eb3d01e94ef00b8672f076f3cfd3f4a9681e204313af876e" gracePeriod=30 Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.041557 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.460335 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67457f9876-l5kn6" Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.514879 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:44 crc kubenswrapper[5004]: I1203 14:27:44.531012 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67457f9876-l5kn6"] Dec 03 14:27:45 crc kubenswrapper[5004]: I1203 14:27:45.629994 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" path="/var/lib/kubelet/pods/4d0bed93-69af-4c49-9d33-e6b847a06885/volumes" Dec 03 14:27:45 crc kubenswrapper[5004]: E1203 14:27:45.714441 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod690e6780_2a59_47d5_8485_6ca1f13cb0de.slice/crio-ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c.scope\": RecentStats: 
unable to find data in memory cache]" Dec 03 14:27:45 crc kubenswrapper[5004]: I1203 14:27:45.993815 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.164819 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.164937 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.164975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.165224 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj2q9\" (UniqueName: \"kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.165277 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.165458 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.165484 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data\") pod \"690e6780-2a59-47d5-8485-6ca1f13cb0de\" (UID: \"690e6780-2a59-47d5-8485-6ca1f13cb0de\") " Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.166064 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/690e6780-2a59-47d5-8485-6ca1f13cb0de-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.173096 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9" (OuterVolumeSpecName: 
"kube-api-access-kj2q9") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "kube-api-access-kj2q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.189253 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.189899 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts" (OuterVolumeSpecName: "scripts") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.232119 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.252419 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.270027 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.270093 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.270108 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj2q9\" (UniqueName: \"kubernetes.io/projected/690e6780-2a59-47d5-8485-6ca1f13cb0de-kube-api-access-kj2q9\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.270129 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.283694 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data" (OuterVolumeSpecName: "config-data") pod "690e6780-2a59-47d5-8485-6ca1f13cb0de" (UID: "690e6780-2a59-47d5-8485-6ca1f13cb0de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.297086 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.372610 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/690e6780-2a59-47d5-8485-6ca1f13cb0de-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.379292 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f76744786-jfgf7" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.468067 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.469165 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578849f58d-fwczz" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api" containerID="cri-o://532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e" gracePeriod=30 Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.471297 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578849f58d-fwczz" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api-log" containerID="cri-o://0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d" gracePeriod=30 Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.540658 5004 generic.go:334] "Generic (PLEG): container finished" podID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerID="ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c" exitCode=0 Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.541315 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.541558 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerDied","Data":"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c"} Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.541597 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"690e6780-2a59-47d5-8485-6ca1f13cb0de","Type":"ContainerDied","Data":"a8d30f44c9d753ab5d4cb4c0050f70f158490350cefb321244ee49c24eee4e6a"} Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.541618 5004 scope.go:117] "RemoveContainer" containerID="f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.634061 5004 scope.go:117] "RemoveContainer" containerID="ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.640943 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.696901 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.713805 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715063 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-api" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715090 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-api" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715104 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="cinder-scheduler" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715116 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="cinder-scheduler" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715162 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerName="init" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715172 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerName="init" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715190 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerName="dnsmasq-dns" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715199 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" containerName="dnsmasq-dns" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715210 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-httpd" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715219 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-httpd" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.715233 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="probe" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715242 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="probe" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715453 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="928b00e0-b667-4430-9bd3-2423bf037d6e" 
containerName="dnsmasq-dns" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715473 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="cinder-scheduler" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715487 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-api" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715501 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0bed93-69af-4c49-9d33-e6b847a06885" containerName="neutron-httpd" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.715512 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" containerName="probe" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.720476 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.724727 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.737717 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.743559 5004 scope.go:117] "RemoveContainer" containerID="f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.745720 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12\": container with ID starting with f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12 not found: ID does not exist" containerID="f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12" Dec 03 14:27:46 crc 
kubenswrapper[5004]: I1203 14:27:46.745803 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12"} err="failed to get container status \"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12\": rpc error: code = NotFound desc = could not find container \"f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12\": container with ID starting with f041be286f2160b40112fc2bcc23a6d13b61bd0d19105f06a737b539ac348a12 not found: ID does not exist" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.745833 5004 scope.go:117] "RemoveContainer" containerID="ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c" Dec 03 14:27:46 crc kubenswrapper[5004]: E1203 14:27:46.746201 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c\": container with ID starting with ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c not found: ID does not exist" containerID="ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.746227 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c"} err="failed to get container status \"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c\": rpc error: code = NotFound desc = could not find container \"ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c\": container with ID starting with ab7f90d5dcccb693d7d79aaac482bb1bd4c25a937399183094370f66832f919c not found: ID does not exist" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.777969 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qhmx\" 
(UniqueName: \"kubernetes.io/projected/7502099c-9fa6-4071-8ce6-4471b9f44f78-kube-api-access-5qhmx\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.778048 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.778128 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-scripts\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.778195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7502099c-9fa6-4071-8ce6-4471b9f44f78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.778236 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.778286 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879604 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-scripts\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879677 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7502099c-9fa6-4071-8ce6-4471b9f44f78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879711 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879755 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879827 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhmx\" (UniqueName: \"kubernetes.io/projected/7502099c-9fa6-4071-8ce6-4471b9f44f78-kube-api-access-5qhmx\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " 
pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879829 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7502099c-9fa6-4071-8ce6-4471b9f44f78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.879894 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.887430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-scripts\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.887630 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.887827 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.896599 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/7502099c-9fa6-4071-8ce6-4471b9f44f78-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:46 crc kubenswrapper[5004]: I1203 14:27:46.902229 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhmx\" (UniqueName: \"kubernetes.io/projected/7502099c-9fa6-4071-8ce6-4471b9f44f78-kube-api-access-5qhmx\") pod \"cinder-scheduler-0\" (UID: \"7502099c-9fa6-4071-8ce6-4471b9f44f78\") " pod="openstack/cinder-scheduler-0" Dec 03 14:27:47 crc kubenswrapper[5004]: I1203 14:27:47.067244 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 14:27:47 crc kubenswrapper[5004]: I1203 14:27:47.553394 5004 generic.go:334] "Generic (PLEG): container finished" podID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerID="0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d" exitCode=143 Dec 03 14:27:47 crc kubenswrapper[5004]: I1203 14:27:47.553658 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerDied","Data":"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d"} Dec 03 14:27:47 crc kubenswrapper[5004]: I1203 14:27:47.601276 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 14:27:47 crc kubenswrapper[5004]: W1203 14:27:47.620402 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7502099c_9fa6_4071_8ce6_4471b9f44f78.slice/crio-fc8d9a6cf7e742d1aa55d4bd15752c4b7a091d76e00f1f14a2aa3d3e2f8f4159 WatchSource:0}: Error finding container fc8d9a6cf7e742d1aa55d4bd15752c4b7a091d76e00f1f14a2aa3d3e2f8f4159: Status 404 returned error can't find the container with id 
fc8d9a6cf7e742d1aa55d4bd15752c4b7a091d76e00f1f14a2aa3d3e2f8f4159 Dec 03 14:27:47 crc kubenswrapper[5004]: I1203 14:27:47.631145 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690e6780-2a59-47d5-8485-6ca1f13cb0de" path="/var/lib/kubelet/pods/690e6780-2a59-47d5-8485-6ca1f13cb0de/volumes" Dec 03 14:27:48 crc kubenswrapper[5004]: I1203 14:27:48.570113 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7502099c-9fa6-4071-8ce6-4471b9f44f78","Type":"ContainerStarted","Data":"e1f52eb9016ab90c788d392f460aa459cbf1468695424efa00ab12c392015edf"} Dec 03 14:27:48 crc kubenswrapper[5004]: I1203 14:27:48.570616 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7502099c-9fa6-4071-8ce6-4471b9f44f78","Type":"ContainerStarted","Data":"fc8d9a6cf7e742d1aa55d4bd15752c4b7a091d76e00f1f14a2aa3d3e2f8f4159"} Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.174499 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.505127 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:34578->10.217.0.145:8443: read: connection reset by peer" Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.584162 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7502099c-9fa6-4071-8ce6-4471b9f44f78","Type":"ContainerStarted","Data":"863266572e0d4a4b96d802b596727e0318689e558722fbd685e2d3fe71f72e13"} Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.592069 5004 generic.go:334] "Generic (PLEG): container finished" podID="962799d4-1cef-40f7-a1d8-e4231680a856" 
containerID="85251560e3d53350eb3d01e94ef00b8672f076f3cfd3f4a9681e204313af876e" exitCode=0 Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.592134 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerDied","Data":"85251560e3d53350eb3d01e94ef00b8672f076f3cfd3f4a9681e204313af876e"} Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.610759 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.610715077 podStartE2EDuration="3.610715077s" podCreationTimestamp="2025-12-03 14:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:27:49.609167943 +0000 UTC m=+1282.358138179" watchObservedRunningTime="2025-12-03 14:27:49.610715077 +0000 UTC m=+1282.359685313" Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.681412 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578849f58d-fwczz" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:39570->10.217.0.158:9311: read: connection reset by peer" Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.681496 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578849f58d-fwczz" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:39586->10.217.0.158:9311: read: connection reset by peer" Dec 03 14:27:49 crc kubenswrapper[5004]: I1203 14:27:49.748062 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-65f67fcd5d-5p75z" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.297333 5004 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.472797 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w4cj\" (UniqueName: \"kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj\") pod \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.472846 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom\") pod \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.472914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle\") pod \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.473654 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data\") pod \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.473835 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs\") pod \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\" (UID: \"3eb05448-f571-4a51-b35b-4e2d2eeed2fc\") " Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.474330 5004 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs" (OuterVolumeSpecName: "logs") pod "3eb05448-f571-4a51-b35b-4e2d2eeed2fc" (UID: "3eb05448-f571-4a51-b35b-4e2d2eeed2fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.475138 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.481988 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3eb05448-f571-4a51-b35b-4e2d2eeed2fc" (UID: "3eb05448-f571-4a51-b35b-4e2d2eeed2fc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.489630 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj" (OuterVolumeSpecName: "kube-api-access-6w4cj") pod "3eb05448-f571-4a51-b35b-4e2d2eeed2fc" (UID: "3eb05448-f571-4a51-b35b-4e2d2eeed2fc"). InnerVolumeSpecName "kube-api-access-6w4cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.515591 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb05448-f571-4a51-b35b-4e2d2eeed2fc" (UID: "3eb05448-f571-4a51-b35b-4e2d2eeed2fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.547470 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data" (OuterVolumeSpecName: "config-data") pod "3eb05448-f571-4a51-b35b-4e2d2eeed2fc" (UID: "3eb05448-f571-4a51-b35b-4e2d2eeed2fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.577153 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w4cj\" (UniqueName: \"kubernetes.io/projected/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-kube-api-access-6w4cj\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.577201 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.577213 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.577227 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb05448-f571-4a51-b35b-4e2d2eeed2fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.606308 5004 generic.go:334] "Generic (PLEG): container finished" podID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerID="532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e" exitCode=0 Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.606542 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" 
event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerDied","Data":"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e"} Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.606587 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578849f58d-fwczz" event={"ID":"3eb05448-f571-4a51-b35b-4e2d2eeed2fc","Type":"ContainerDied","Data":"40aac21c881dad59f0cfd27893741f48944a16d4bf0008701c4efe11e6dcbc82"} Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.606605 5004 scope.go:117] "RemoveContainer" containerID="532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.607167 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578849f58d-fwczz" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.660533 5004 scope.go:117] "RemoveContainer" containerID="0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.662252 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.673691 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-578849f58d-fwczz"] Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.689219 5004 scope.go:117] "RemoveContainer" containerID="532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e" Dec 03 14:27:50 crc kubenswrapper[5004]: E1203 14:27:50.689789 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e\": container with ID starting with 532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e not found: ID does not exist" containerID="532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e" Dec 03 
14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.689916 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e"} err="failed to get container status \"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e\": rpc error: code = NotFound desc = could not find container \"532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e\": container with ID starting with 532ae170010c25fd278cbf2a16360debb5668dcb2511c643f88748731beac42e not found: ID does not exist" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.690019 5004 scope.go:117] "RemoveContainer" containerID="0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d" Dec 03 14:27:50 crc kubenswrapper[5004]: E1203 14:27:50.690415 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d\": container with ID starting with 0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d not found: ID does not exist" containerID="0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d" Dec 03 14:27:50 crc kubenswrapper[5004]: I1203 14:27:50.690467 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d"} err="failed to get container status \"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d\": rpc error: code = NotFound desc = could not find container \"0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d\": container with ID starting with 0670c6ef4d9c05d7c1438d8fa10ce24b76a0178aab597403301eb3beae10622d not found: ID does not exist" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.130768 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 14:27:51 crc 
kubenswrapper[5004]: E1203 14:27:51.131173 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api-log" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.131190 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api-log" Dec 03 14:27:51 crc kubenswrapper[5004]: E1203 14:27:51.131216 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.131224 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.131416 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.131441 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" containerName="barbican-api-log" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.132075 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.134232 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.134474 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.139010 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wc5vj" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.171893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.290277 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.290554 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config-secret\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.290622 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.290657 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z77mq\" (UniqueName: \"kubernetes.io/projected/32ad9006-53eb-4a4f-8ec0-8c287231374e-kube-api-access-z77mq\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.392398 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z77mq\" (UniqueName: \"kubernetes.io/projected/32ad9006-53eb-4a4f-8ec0-8c287231374e-kube-api-access-z77mq\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.392508 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.392557 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config-secret\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.392623 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.393560 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.399549 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.405094 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32ad9006-53eb-4a4f-8ec0-8c287231374e-openstack-config-secret\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.422488 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z77mq\" (UniqueName: \"kubernetes.io/projected/32ad9006-53eb-4a4f-8ec0-8c287231374e-kube-api-access-z77mq\") pod \"openstackclient\" (UID: \"32ad9006-53eb-4a4f-8ec0-8c287231374e\") " pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.462434 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.632443 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb05448-f571-4a51-b35b-4e2d2eeed2fc" path="/var/lib/kubelet/pods/3eb05448-f571-4a51-b35b-4e2d2eeed2fc/volumes" Dec 03 14:27:51 crc kubenswrapper[5004]: I1203 14:27:51.720830 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 14:27:52 crc kubenswrapper[5004]: I1203 14:27:52.067430 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 14:27:52 crc kubenswrapper[5004]: I1203 14:27:52.648315 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"32ad9006-53eb-4a4f-8ec0-8c287231374e","Type":"ContainerStarted","Data":"d16d7d177cc560621449d30ccd5f3948ba770380484301349ae3ab2a183a61af"} Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.908240 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b9d87dc5f-trzlj"] Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.914250 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.917599 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.917896 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.918930 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 14:27:55 crc kubenswrapper[5004]: I1203 14:27:55.927457 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b9d87dc5f-trzlj"] Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.078491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdbhz\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-kube-api-access-rdbhz\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.078821 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-config-data\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.078922 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-run-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 
14:27:56.078974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-etc-swift\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.079004 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-log-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.079025 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-combined-ca-bundle\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.079100 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-internal-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.079145 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-public-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc 
kubenswrapper[5004]: I1203 14:27:56.181289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdbhz\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-kube-api-access-rdbhz\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181344 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-config-data\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-run-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181437 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-etc-swift\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181459 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-log-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181474 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-combined-ca-bundle\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181524 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-internal-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.181554 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-public-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.182148 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-run-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.182187 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37c9311f-7b12-474a-ba76-c7c534f55e55-log-httpd\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.188381 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-internal-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.189420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-public-tls-certs\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.191561 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-etc-swift\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.193574 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-combined-ca-bundle\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.196062 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c9311f-7b12-474a-ba76-c7c534f55e55-config-data\") pod \"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.208016 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdbhz\" (UniqueName: \"kubernetes.io/projected/37c9311f-7b12-474a-ba76-c7c534f55e55-kube-api-access-rdbhz\") pod 
\"swift-proxy-b9d87dc5f-trzlj\" (UID: \"37c9311f-7b12-474a-ba76-c7c534f55e55\") " pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.253638 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.936184 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.936723 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-central-agent" containerID="cri-o://c07040fe4f2ab205567662214e5cd2b9cbf001a7c57a0ff3781d638b3202230b" gracePeriod=30 Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.936793 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="sg-core" containerID="cri-o://00044496e8531ccbd44a5638ff8e6d16a0ece6e5fdb86717ee4e2426f2e4c720" gracePeriod=30 Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.936793 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" containerID="cri-o://09832fabe3456743ce5a7f91860612d763c6153eefaf4ddd5c7bbca6bacf2d16" gracePeriod=30 Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.936839 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-notification-agent" containerID="cri-o://cd2dcac8db020559ba23390885c37b2a2509d530d6deff69030578500225185c" gracePeriod=30 Dec 03 14:27:56 crc kubenswrapper[5004]: I1203 14:27:56.946638 5004 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": EOF" Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.336936 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700334 5004 generic.go:334] "Generic (PLEG): container finished" podID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerID="09832fabe3456743ce5a7f91860612d763c6153eefaf4ddd5c7bbca6bacf2d16" exitCode=0 Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700364 5004 generic.go:334] "Generic (PLEG): container finished" podID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerID="00044496e8531ccbd44a5638ff8e6d16a0ece6e5fdb86717ee4e2426f2e4c720" exitCode=2 Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700372 5004 generic.go:334] "Generic (PLEG): container finished" podID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerID="c07040fe4f2ab205567662214e5cd2b9cbf001a7c57a0ff3781d638b3202230b" exitCode=0 Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700393 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerDied","Data":"09832fabe3456743ce5a7f91860612d763c6153eefaf4ddd5c7bbca6bacf2d16"} Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700418 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerDied","Data":"00044496e8531ccbd44a5638ff8e6d16a0ece6e5fdb86717ee4e2426f2e4c720"} Dec 03 14:27:57 crc kubenswrapper[5004]: I1203 14:27:57.700430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerDied","Data":"c07040fe4f2ab205567662214e5cd2b9cbf001a7c57a0ff3781d638b3202230b"} Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.175150 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.185779 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.186525 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-log" containerID="cri-o://7002bfac4aebf05c2adccb4104c283116bdf8849a79cbba847009be002162ca7" gracePeriod=30 Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.186738 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-httpd" containerID="cri-o://554a30263a830941569cbb5d0572bdb44c33ff59ddd7f9891f94c991971a9a25" gracePeriod=30 Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.713937 5004 generic.go:334] "Generic (PLEG): container finished" podID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerID="7002bfac4aebf05c2adccb4104c283116bdf8849a79cbba847009be002162ca7" exitCode=143 Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.714324 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerDied","Data":"7002bfac4aebf05c2adccb4104c283116bdf8849a79cbba847009be002162ca7"} 
Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.717744 5004 generic.go:334] "Generic (PLEG): container finished" podID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerID="cd2dcac8db020559ba23390885c37b2a2509d530d6deff69030578500225185c" exitCode=0 Dec 03 14:27:58 crc kubenswrapper[5004]: I1203 14:27:58.717787 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerDied","Data":"cd2dcac8db020559ba23390885c37b2a2509d530d6deff69030578500225185c"} Dec 03 14:28:00 crc kubenswrapper[5004]: I1203 14:28:00.718478 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:00 crc kubenswrapper[5004]: I1203 14:28:00.719016 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-log" containerID="cri-o://a6a322c31ab3f0c4562187a886610d52367b6408ee0c2e18e7e58d3f499d53ce" gracePeriod=30 Dec 03 14:28:00 crc kubenswrapper[5004]: I1203 14:28:00.719457 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-httpd" containerID="cri-o://bbfe6de4298b97ef3f233b199b3c250e82ffbff3f19f7ef17d2c2c8f966c7c05" gracePeriod=30 Dec 03 14:28:01 crc kubenswrapper[5004]: I1203 14:28:01.750033 5004 generic.go:334] "Generic (PLEG): container finished" podID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerID="a6a322c31ab3f0c4562187a886610d52367b6408ee0c2e18e7e58d3f499d53ce" exitCode=143 Dec 03 14:28:01 crc kubenswrapper[5004]: I1203 14:28:01.750127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerDied","Data":"a6a322c31ab3f0c4562187a886610d52367b6408ee0c2e18e7e58d3f499d53ce"} Dec 03 14:28:01 crc kubenswrapper[5004]: I1203 14:28:01.755482 5004 generic.go:334] "Generic (PLEG): container finished" podID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerID="554a30263a830941569cbb5d0572bdb44c33ff59ddd7f9891f94c991971a9a25" exitCode=0 Dec 03 14:28:01 crc kubenswrapper[5004]: I1203 14:28:01.755535 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerDied","Data":"554a30263a830941569cbb5d0572bdb44c33ff59ddd7f9891f94c991971a9a25"} Dec 03 14:28:02 crc kubenswrapper[5004]: I1203 14:28:02.539178 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": dial tcp 10.217.0.162:3000: connect: connection refused" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.372107 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.453307 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:28:03 crc kubenswrapper[5004]: W1203 14:28:03.454604 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c9311f_7b12_474a_ba76_c7c534f55e55.slice/crio-4234b6078ba0e893077333ef37d5058a6d29fd03faed1513d88a07f2f8f779ff WatchSource:0}: Error finding container 4234b6078ba0e893077333ef37d5058a6d29fd03faed1513d88a07f2f8f779ff: Status 404 returned error can't find the container with id 4234b6078ba0e893077333ef37d5058a6d29fd03faed1513d88a07f2f8f779ff Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.464534 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b9d87dc5f-trzlj"] Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.469893 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470055 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470158 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jc8j\" (UniqueName: \"kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470229 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470326 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470406 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470482 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9mvj\" (UniqueName: \"kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470555 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470629 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470700 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470765 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470836 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.470959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.471039 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml\") pod \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\" (UID: \"c3b5e86f-ea86-4bfe-bc32-442636f29e80\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.471126 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle\") pod \"c2d8cb69-0cf9-4fc7-8834-f850682127d0\" (UID: 
\"c2d8cb69-0cf9-4fc7-8834-f850682127d0\") " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.472487 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs" (OuterVolumeSpecName: "logs") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.473313 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.474262 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.486404 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j" (OuterVolumeSpecName: "kube-api-access-8jc8j") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "kube-api-access-8jc8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.486588 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj" (OuterVolumeSpecName: "kube-api-access-x9mvj") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "kube-api-access-x9mvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.488015 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.494708 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts" (OuterVolumeSpecName: "scripts") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.499273 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.504804 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts" (OuterVolumeSpecName: "scripts") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.566840 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.572975 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jc8j\" (UniqueName: \"kubernetes.io/projected/c2d8cb69-0cf9-4fc7-8834-f850682127d0-kube-api-access-8jc8j\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573123 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573204 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573293 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9mvj\" (UniqueName: \"kubernetes.io/projected/c3b5e86f-ea86-4bfe-bc32-442636f29e80-kube-api-access-x9mvj\") on node \"crc\" DevicePath \"\"" Dec 
03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573374 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573458 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d8cb69-0cf9-4fc7-8834-f850682127d0-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573528 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573706 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573789 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3b5e86f-ea86-4bfe-bc32-442636f29e80-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.573944 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.592865 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.603064 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.608678 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.618060 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data" (OuterVolumeSpecName: "config-data") pod "c2d8cb69-0cf9-4fc7-8834-f850682127d0" (UID: "c2d8cb69-0cf9-4fc7-8834-f850682127d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.644739 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.675378 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.675414 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.675427 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d8cb69-0cf9-4fc7-8834-f850682127d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.675438 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.675449 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.687489 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data" (OuterVolumeSpecName: "config-data") pod "c3b5e86f-ea86-4bfe-bc32-442636f29e80" (UID: "c3b5e86f-ea86-4bfe-bc32-442636f29e80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.775368 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b9d87dc5f-trzlj" event={"ID":"37c9311f-7b12-474a-ba76-c7c534f55e55","Type":"ContainerStarted","Data":"4234b6078ba0e893077333ef37d5058a6d29fd03faed1513d88a07f2f8f779ff"} Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.777046 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b5e86f-ea86-4bfe-bc32-442636f29e80-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.778319 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.778484 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2d8cb69-0cf9-4fc7-8834-f850682127d0","Type":"ContainerDied","Data":"fb216181e3a08602415beff525613ffb1d46709cc8299b1baf3a5796025978a6"} Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.778636 5004 scope.go:117] "RemoveContainer" containerID="554a30263a830941569cbb5d0572bdb44c33ff59ddd7f9891f94c991971a9a25" Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.783707 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3b5e86f-ea86-4bfe-bc32-442636f29e80","Type":"ContainerDied","Data":"3ca56be3dc9d414ce84ec0c2da3b9deae196eb6953274554d1d4e8425fbb4933"} Dec 03 14:28:03 crc kubenswrapper[5004]: I1203 14:28:03.783821 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.034888 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.036727 5004 scope.go:117] "RemoveContainer" containerID="7002bfac4aebf05c2adccb4104c283116bdf8849a79cbba847009be002162ca7" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.056154 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.072506 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:56312->10.217.0.149:9292: read: connection reset by peer" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.072960 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:56326->10.217.0.149:9292: read: connection reset by peer" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.081994 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082445 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="sg-core" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082465 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="sg-core" Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082502 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082510 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082521 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-notification-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082531 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-notification-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082549 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-log" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082560 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-log" Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082576 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082582 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: E1203 14:28:04.082593 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-central-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082599 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-central-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082762 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="sg-core" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082776 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082792 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="proxy-httpd" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082801 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-central-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082816 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" containerName="glance-log" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.082832 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" containerName="ceilometer-notification-agent" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.084009 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.086707 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.086980 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.091707 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.155843 5004 scope.go:117] "RemoveContainer" containerID="09832fabe3456743ce5a7f91860612d763c6153eefaf4ddd5c7bbca6bacf2d16" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.158273 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.181780 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.188563 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.191172 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.194751 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.194988 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.204633 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208172 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208490 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208728 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7rk\" (UniqueName: \"kubernetes.io/projected/e2f95292-1d71-478d-ab12-138e2b34bd3f-kube-api-access-xf7rk\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.208980 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.209152 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.209282 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.257188 5004 scope.go:117] "RemoveContainer" containerID="00044496e8531ccbd44a5638ff8e6d16a0ece6e5fdb86717ee4e2426f2e4c720" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 
14:28:04.277696 5004 scope.go:117] "RemoveContainer" containerID="cd2dcac8db020559ba23390885c37b2a2509d530d6deff69030578500225185c" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.303055 5004 scope.go:117] "RemoveContainer" containerID="c07040fe4f2ab205567662214e5cd2b9cbf001a7c57a0ff3781d638b3202230b" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.311409 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313120 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313315 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313397 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313508 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97f8\" (UniqueName: \"kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313526 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313597 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313618 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts\") pod 
\"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313740 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313783 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313799 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.313852 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7rk\" (UniqueName: \"kubernetes.io/projected/e2f95292-1d71-478d-ab12-138e2b34bd3f-kube-api-access-xf7rk\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " 
pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.317405 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-logs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.317471 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2f95292-1d71-478d-ab12-138e2b34bd3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.318006 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.319923 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.320184 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 
14:28:04.322278 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.333060 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f95292-1d71-478d-ab12-138e2b34bd3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.335366 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7rk\" (UniqueName: \"kubernetes.io/projected/e2f95292-1d71-478d-ab12-138e2b34bd3f-kube-api-access-xf7rk\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.360725 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e2f95292-1d71-478d-ab12-138e2b34bd3f\") " pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415101 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97f8\" (UniqueName: \"kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415146 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415208 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415304 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.415350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 
14:28:04.416739 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.418480 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.419443 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.419768 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.420503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.424814 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " 
pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.437765 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g97f8\" (UniqueName: \"kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8\") pod \"ceilometer-0\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.458142 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.529929 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.815737 5004 generic.go:334] "Generic (PLEG): container finished" podID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerID="bbfe6de4298b97ef3f233b199b3c250e82ffbff3f19f7ef17d2c2c8f966c7c05" exitCode=0 Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.816343 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerDied","Data":"bbfe6de4298b97ef3f233b199b3c250e82ffbff3f19f7ef17d2c2c8f966c7c05"} Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.827840 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b9d87dc5f-trzlj" event={"ID":"37c9311f-7b12-474a-ba76-c7c534f55e55","Type":"ContainerStarted","Data":"a404c423a790af4d059100c16b3f184e25746d395435eb6eec75e397d3c51650"} Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.838645 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"32ad9006-53eb-4a4f-8ec0-8c287231374e","Type":"ContainerStarted","Data":"a7ae1a02f0b455a0c6a1d8b89a19bda0f26b00992d15f16cdadd4732a8bd816b"} Dec 03 14:28:04 crc kubenswrapper[5004]: I1203 14:28:04.880477 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.4787827399999998 podStartE2EDuration="13.880444738s" podCreationTimestamp="2025-12-03 14:27:51 +0000 UTC" firstStartedPulling="2025-12-03 14:27:51.731435506 +0000 UTC m=+1284.480405752" lastFinishedPulling="2025-12-03 14:28:03.133097514 +0000 UTC m=+1295.882067750" observedRunningTime="2025-12-03 14:28:04.861786455 +0000 UTC m=+1297.610756691" watchObservedRunningTime="2025-12-03 14:28:04.880444738 +0000 UTC m=+1297.629414984" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.157817 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.334421 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434336 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434398 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434431 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434457 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434586 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434651 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.434669 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4wzq\" (UniqueName: \"kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq\") pod \"1690b697-2b2a-4c83-8494-b5e525a414ea\" (UID: \"1690b697-2b2a-4c83-8494-b5e525a414ea\") " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.437949 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs" (OuterVolumeSpecName: "logs") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: 
"1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.439086 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.442998 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.443142 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts" (OuterVolumeSpecName: "scripts") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.446139 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq" (OuterVolumeSpecName: "kube-api-access-q4wzq") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "kube-api-access-q4wzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.495911 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.504070 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data" (OuterVolumeSpecName: "config-data") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.536864 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.536890 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.536955 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.536981 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4wzq\" (UniqueName: \"kubernetes.io/projected/1690b697-2b2a-4c83-8494-b5e525a414ea-kube-api-access-q4wzq\") on node \"crc\" DevicePath 
\"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.537012 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.537020 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.537030 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b697-2b2a-4c83-8494-b5e525a414ea-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.565270 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.579250 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1690b697-2b2a-4c83-8494-b5e525a414ea" (UID: "1690b697-2b2a-4c83-8494-b5e525a414ea"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.586097 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.638328 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b697-2b2a-4c83-8494-b5e525a414ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.638375 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.655389 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d8cb69-0cf9-4fc7-8834-f850682127d0" path="/var/lib/kubelet/pods/c2d8cb69-0cf9-4fc7-8834-f850682127d0/volumes" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.657851 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b5e86f-ea86-4bfe-bc32-442636f29e80" path="/var/lib/kubelet/pods/c3b5e86f-ea86-4bfe-bc32-442636f29e80/volumes" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.875347 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b9d87dc5f-trzlj" event={"ID":"37c9311f-7b12-474a-ba76-c7c534f55e55","Type":"ContainerStarted","Data":"0bdf4ac9e8efacd9c08d52256fd5dc054e54b1e8999debb5c8b1292abb6be4fc"} Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.876250 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.876337 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.886040 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerStarted","Data":"e440a762a9669503c0ee9e795f62dcbab1e6a01759e3a37f14a51a759727cdd4"} Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.927500 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f95292-1d71-478d-ab12-138e2b34bd3f","Type":"ContainerStarted","Data":"7004313bf2851758dfa5ab10a384bc82dbf6965a7e7b5c8c9be5adaf066c93c5"} Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.933394 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b9d87dc5f-trzlj" podStartSLOduration=10.933370314 podStartE2EDuration="10.933370314s" podCreationTimestamp="2025-12-03 14:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:05.927327983 +0000 UTC m=+1298.676298229" watchObservedRunningTime="2025-12-03 14:28:05.933370314 +0000 UTC m=+1298.682340550" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.962945 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b697-2b2a-4c83-8494-b5e525a414ea","Type":"ContainerDied","Data":"3ccf3f197beda4a197c54d9872c91d7a8a245e26aac4528a75dfc9dcd52116cc"} Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.963014 5004 scope.go:117] "RemoveContainer" containerID="bbfe6de4298b97ef3f233b199b3c250e82ffbff3f19f7ef17d2c2c8f966c7c05" Dec 03 14:28:05 crc kubenswrapper[5004]: I1203 14:28:05.963705 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.026981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.049398 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.072341 5004 scope.go:117] "RemoveContainer" containerID="a6a322c31ab3f0c4562187a886610d52367b6408ee0c2e18e7e58d3f499d53ce" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.129381 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:06 crc kubenswrapper[5004]: E1203 14:28:06.130238 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-httpd" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.130257 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-httpd" Dec 03 14:28:06 crc kubenswrapper[5004]: E1203 14:28:06.130287 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-log" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.130296 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-log" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.130553 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-log" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.130568 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" containerName="glance-httpd" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.131513 5004 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.135706 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.135993 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.194728 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270697 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270762 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqzc\" (UniqueName: \"kubernetes.io/projected/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-kube-api-access-6sqzc\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270815 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270838 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.270976 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.271036 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.271084 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.374449 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.374727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqzc\" (UniqueName: \"kubernetes.io/projected/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-kube-api-access-6sqzc\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.374820 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.374900 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.374994 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.375096 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.375178 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.375255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.375573 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-logs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.375999 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.381377 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.383463 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.387571 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.388222 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.388966 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.394210 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqzc\" (UniqueName: \"kubernetes.io/projected/3959efd9-7c4e-43b5-b73a-9b05ec3fb59c-kube-api-access-6sqzc\") pod \"glance-default-internal-api-0\" (UID: 
\"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.416451 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c\") " pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.581567 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.977212 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerStarted","Data":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} Dec 03 14:28:06 crc kubenswrapper[5004]: I1203 14:28:06.980940 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f95292-1d71-478d-ab12-138e2b34bd3f","Type":"ContainerStarted","Data":"530072da532b90200c625705896070ee0838add3db25c84dce72c01033b9ccf5"} Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.130586 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.628536 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1690b697-2b2a-4c83-8494-b5e525a414ea" path="/var/lib/kubelet/pods/1690b697-2b2a-4c83-8494-b5e525a414ea/volumes" Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.863234 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gl6js"] Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.869190 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.876909 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gl6js"] Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.978919 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pwhkz"] Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.980944 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.989513 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a3f9-account-create-update-j96b4"] Dec 03 14:28:07 crc kubenswrapper[5004]: I1203 14:28:07.991161 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.000110 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.000420 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.013027 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a3f9-account-create-update-j96b4"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.018173 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2f95292-1d71-478d-ab12-138e2b34bd3f","Type":"ContainerStarted","Data":"dc66b5a8aa6adb1c2613d7de6d79282ef8284ba5d111975c0bc36c7f225d6d9a"} Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.020428 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4txs\" (UniqueName: 
\"kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs\") pod \"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.020656 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts\") pod \"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.024280 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pwhkz"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.029686 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c","Type":"ContainerStarted","Data":"b9ad371f326695916b0852f20fe37d181f791bb3df47810960bb8dbfe614eee2"} Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.029818 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c","Type":"ContainerStarted","Data":"8de1fc13e2dc34bcb02153325848e1177c6ce5ea9ce6914cec37712319f5d1f4"} Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.037702 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerStarted","Data":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.098243 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.098225609 podStartE2EDuration="4.098225609s" podCreationTimestamp="2025-12-03 14:28:04 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:08.066661835 +0000 UTC m=+1300.815632071" watchObservedRunningTime="2025-12-03 14:28:08.098225609 +0000 UTC m=+1300.847195845" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.131149 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.131498 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzhr\" (UniqueName: \"kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.134336 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4txs\" (UniqueName: \"kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs\") pod \"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.134412 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts\") pod \"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 
14:28:08.134520 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.134923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmg7j\" (UniqueName: \"kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.139224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts\") pod \"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.175575 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-587bd47d68-c6stc" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.175701 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-587bd47d68-c6stc" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.177763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4txs\" (UniqueName: \"kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs\") pod 
\"nova-api-db-create-gl6js\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.179678 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-g7df6"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.181505 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.208384 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.216003 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4b76-account-create-update-v5798"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.217322 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.219223 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.238814 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzhr\" (UniqueName: \"kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.238908 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " pod="openstack/nova-cell0-db-create-pwhkz" 
Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.238980 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmg7j\" (UniqueName: \"kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.239009 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.239694 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.240525 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.258491 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmg7j\" (UniqueName: \"kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j\") pod \"nova-cell0-db-create-pwhkz\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " 
pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.263524 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzhr\" (UniqueName: \"kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr\") pod \"nova-api-a3f9-account-create-update-j96b4\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.267493 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4b76-account-create-update-v5798"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.304598 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-g7df6"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.344166 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts\") pod \"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.344234 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznlh\" (UniqueName: \"kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh\") pod \"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.344328 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tczp6\" (UniqueName: \"kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6\") pod 
\"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.344628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts\") pod \"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.407592 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6611-account-create-update-vs8jg"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.409272 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.412133 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.421409 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6611-account-create-update-vs8jg"] Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.446487 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.446719 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts\") pod 
\"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.446761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shh5\" (UniqueName: \"kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.446814 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts\") pod \"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.446846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznlh\" (UniqueName: \"kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh\") pod \"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.447102 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tczp6\" (UniqueName: \"kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6\") pod \"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.448453 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts\") pod \"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.449126 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts\") pod \"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.466830 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tczp6\" (UniqueName: \"kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6\") pod \"nova-cell0-4b76-account-create-update-v5798\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.473794 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznlh\" (UniqueName: \"kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh\") pod \"nova-cell1-db-create-g7df6\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.549246 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shh5\" (UniqueName: \"kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.549668 5004 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.549702 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.550369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.562670 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.568391 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shh5\" (UniqueName: \"kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5\") pod \"nova-cell1-6611-account-create-update-vs8jg\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.575786 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.608281 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.753221 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:08 crc kubenswrapper[5004]: I1203 14:28:08.925808 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gl6js"] Dec 03 14:28:09 crc kubenswrapper[5004]: I1203 14:28:09.076936 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gl6js" event={"ID":"ed2587cd-e48f-400d-b782-04f2c573862a","Type":"ContainerStarted","Data":"671a9e5029acb95362e869cf2abadc43bf1b1f105c58bd32cd776b6fa039686a"} Dec 03 14:28:09 crc kubenswrapper[5004]: I1203 14:28:09.559636 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4b76-account-create-update-v5798"] Dec 03 14:28:09 crc kubenswrapper[5004]: I1203 14:28:09.568227 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pwhkz"] Dec 03 14:28:09 crc kubenswrapper[5004]: I1203 14:28:09.579073 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-g7df6"] Dec 03 14:28:09 crc kubenswrapper[5004]: I1203 14:28:09.589602 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6611-account-create-update-vs8jg"] Dec 03 14:28:09 crc kubenswrapper[5004]: W1203 14:28:09.601091 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b84bb98_6817_47a4_8d88_dfeb6a19e195.slice/crio-654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba WatchSource:0}: Error finding container 654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba: Status 404 returned error can't find the container with id 654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba Dec 03 14:28:09 crc 
kubenswrapper[5004]: I1203 14:28:09.601317 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a3f9-account-create-update-j96b4"] Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.088522 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-g7df6" event={"ID":"9b84bb98-6817-47a4-8d88-dfeb6a19e195","Type":"ContainerStarted","Data":"fafc18bfee759b2a197049db6386283346b941a560625fb31f1fc8819e9d4025"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.090020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-g7df6" event={"ID":"9b84bb98-6817-47a4-8d88-dfeb6a19e195","Type":"ContainerStarted","Data":"654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.096104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3959efd9-7c4e-43b5-b73a-9b05ec3fb59c","Type":"ContainerStarted","Data":"0cffe88111d299cd2a5b62e200ae323e4c70c28fa08563505ce7e03585a43799"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.106249 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerStarted","Data":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.118433 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-g7df6" podStartSLOduration=2.118407947 podStartE2EDuration="2.118407947s" podCreationTimestamp="2025-12-03 14:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.107536129 +0000 UTC m=+1302.856506365" watchObservedRunningTime="2025-12-03 14:28:10.118407947 +0000 UTC m=+1302.867378183" Dec 03 14:28:10 crc 
kubenswrapper[5004]: I1203 14:28:10.121760 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pwhkz" event={"ID":"9df3e0e7-8b8b-4f47-8b19-40afbab582d6","Type":"ContainerStarted","Data":"b7fde29958f276d04f31bea0fde651a061cd6328195f0f202053bc8f3874f3f1"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.121831 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pwhkz" event={"ID":"9df3e0e7-8b8b-4f47-8b19-40afbab582d6","Type":"ContainerStarted","Data":"ac7a645c33377068ddf6ce4f97762656dd2506634224887fa2f212e4406b37a7"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.125795 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4b76-account-create-update-v5798" event={"ID":"804a1196-3d3c-4e15-8a2c-3ae12a943249","Type":"ContainerStarted","Data":"b58f8a8ee5cb386d03a7e711fe2171ccb92f54f4232d8b5ee394f908b718522d"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.125977 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4b76-account-create-update-v5798" event={"ID":"804a1196-3d3c-4e15-8a2c-3ae12a943249","Type":"ContainerStarted","Data":"aa776eca12008b76487d573ab7ced14c65f69dad76e53b8de8b5f2b0da3cdff5"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.129193 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" event={"ID":"16e053df-7662-47f4-bd6d-ed3f75ca1901","Type":"ContainerStarted","Data":"0e467b3d99b383fffebd71c6ff56d045e9ac3e107a0c125f3f1d4893728253f5"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.129271 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" event={"ID":"16e053df-7662-47f4-bd6d-ed3f75ca1901","Type":"ContainerStarted","Data":"7bb0c69dd70915fcc2fd4cabb9cb9f78a6c155d9d9b487725860c5d19ee1c71d"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.143659 5004 
generic.go:334] "Generic (PLEG): container finished" podID="ed2587cd-e48f-400d-b782-04f2c573862a" containerID="0957ac13149aa15a69ae904702bc69f4ed5ec77d67e963fca236bd6c4226da6c" exitCode=0 Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.143772 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gl6js" event={"ID":"ed2587cd-e48f-400d-b782-04f2c573862a","Type":"ContainerDied","Data":"0957ac13149aa15a69ae904702bc69f4ed5ec77d67e963fca236bd6c4226da6c"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.144104 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.144078465 podStartE2EDuration="4.144078465s" podCreationTimestamp="2025-12-03 14:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.135177012 +0000 UTC m=+1302.884147248" watchObservedRunningTime="2025-12-03 14:28:10.144078465 +0000 UTC m=+1302.893048711" Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.148328 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a3f9-account-create-update-j96b4" event={"ID":"57216442-7799-4751-8116-ba7d842d4be9","Type":"ContainerStarted","Data":"d9e43b9bf20d31643d65208e19b5996b5a3a0a10177f52588dc76f0f50ee65bb"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.148385 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a3f9-account-create-update-j96b4" event={"ID":"57216442-7799-4751-8116-ba7d842d4be9","Type":"ContainerStarted","Data":"6c0fd9692d30ae16f6f5cc79d4507dec5fc745043bb4e0a86022ad66c9b09950"} Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.163746 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pwhkz" podStartSLOduration=3.163723691 podStartE2EDuration="3.163723691s" 
podCreationTimestamp="2025-12-03 14:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.158349149 +0000 UTC m=+1302.907319385" watchObservedRunningTime="2025-12-03 14:28:10.163723691 +0000 UTC m=+1302.912693937" Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.179961 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" podStartSLOduration=2.179946151 podStartE2EDuration="2.179946151s" podCreationTimestamp="2025-12-03 14:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.173769116 +0000 UTC m=+1302.922739352" watchObservedRunningTime="2025-12-03 14:28:10.179946151 +0000 UTC m=+1302.928916387" Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.202370 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4b76-account-create-update-v5798" podStartSLOduration=2.202345985 podStartE2EDuration="2.202345985s" podCreationTimestamp="2025-12-03 14:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.190778948 +0000 UTC m=+1302.939749184" watchObservedRunningTime="2025-12-03 14:28:10.202345985 +0000 UTC m=+1302.951316221" Dec 03 14:28:10 crc kubenswrapper[5004]: I1203 14:28:10.216504 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-a3f9-account-create-update-j96b4" podStartSLOduration=3.216487376 podStartE2EDuration="3.216487376s" podCreationTimestamp="2025-12-03 14:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:10.206575425 +0000 UTC m=+1302.955545671" 
watchObservedRunningTime="2025-12-03 14:28:10.216487376 +0000 UTC m=+1302.965457632" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.159113 5004 generic.go:334] "Generic (PLEG): container finished" podID="57216442-7799-4751-8116-ba7d842d4be9" containerID="d9e43b9bf20d31643d65208e19b5996b5a3a0a10177f52588dc76f0f50ee65bb" exitCode=0 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.159205 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a3f9-account-create-update-j96b4" event={"ID":"57216442-7799-4751-8116-ba7d842d4be9","Type":"ContainerDied","Data":"d9e43b9bf20d31643d65208e19b5996b5a3a0a10177f52588dc76f0f50ee65bb"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.161316 5004 generic.go:334] "Generic (PLEG): container finished" podID="9b84bb98-6817-47a4-8d88-dfeb6a19e195" containerID="fafc18bfee759b2a197049db6386283346b941a560625fb31f1fc8819e9d4025" exitCode=0 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.161355 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-g7df6" event={"ID":"9b84bb98-6817-47a4-8d88-dfeb6a19e195","Type":"ContainerDied","Data":"fafc18bfee759b2a197049db6386283346b941a560625fb31f1fc8819e9d4025"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164082 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerStarted","Data":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164233 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-central-agent" containerID="cri-o://3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" gracePeriod=30 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164279 5004 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164312 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="sg-core" containerID="cri-o://eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" gracePeriod=30 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164328 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="proxy-httpd" containerID="cri-o://1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" gracePeriod=30 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.164343 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-notification-agent" containerID="cri-o://6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" gracePeriod=30 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.171640 5004 generic.go:334] "Generic (PLEG): container finished" podID="9df3e0e7-8b8b-4f47-8b19-40afbab582d6" containerID="b7fde29958f276d04f31bea0fde651a061cd6328195f0f202053bc8f3874f3f1" exitCode=0 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.171766 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pwhkz" event={"ID":"9df3e0e7-8b8b-4f47-8b19-40afbab582d6","Type":"ContainerDied","Data":"b7fde29958f276d04f31bea0fde651a061cd6328195f0f202053bc8f3874f3f1"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.176997 5004 generic.go:334] "Generic (PLEG): container finished" podID="804a1196-3d3c-4e15-8a2c-3ae12a943249" containerID="b58f8a8ee5cb386d03a7e711fe2171ccb92f54f4232d8b5ee394f908b718522d" exitCode=0 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.177081 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4b76-account-create-update-v5798" event={"ID":"804a1196-3d3c-4e15-8a2c-3ae12a943249","Type":"ContainerDied","Data":"b58f8a8ee5cb386d03a7e711fe2171ccb92f54f4232d8b5ee394f908b718522d"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.181513 5004 generic.go:334] "Generic (PLEG): container finished" podID="16e053df-7662-47f4-bd6d-ed3f75ca1901" containerID="0e467b3d99b383fffebd71c6ff56d045e9ac3e107a0c125f3f1d4893728253f5" exitCode=0 Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.181811 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" event={"ID":"16e053df-7662-47f4-bd6d-ed3f75ca1901","Type":"ContainerDied","Data":"0e467b3d99b383fffebd71c6ff56d045e9ac3e107a0c125f3f1d4893728253f5"} Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.218945 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.989562775 podStartE2EDuration="7.218924802s" podCreationTimestamp="2025-12-03 14:28:04 +0000 UTC" firstStartedPulling="2025-12-03 14:28:05.166722046 +0000 UTC m=+1297.915692282" lastFinishedPulling="2025-12-03 14:28:10.396084073 +0000 UTC m=+1303.145054309" observedRunningTime="2025-12-03 14:28:11.205624825 +0000 UTC m=+1303.954595091" watchObservedRunningTime="2025-12-03 14:28:11.218924802 +0000 UTC m=+1303.967895058" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.266842 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.272404 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b9d87dc5f-trzlj" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.703104 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.844993 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts\") pod \"ed2587cd-e48f-400d-b782-04f2c573862a\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.845058 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4txs\" (UniqueName: \"kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs\") pod \"ed2587cd-e48f-400d-b782-04f2c573862a\" (UID: \"ed2587cd-e48f-400d-b782-04f2c573862a\") " Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.845500 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed2587cd-e48f-400d-b782-04f2c573862a" (UID: "ed2587cd-e48f-400d-b782-04f2c573862a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.845833 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed2587cd-e48f-400d-b782-04f2c573862a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.850788 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs" (OuterVolumeSpecName: "kube-api-access-x4txs") pod "ed2587cd-e48f-400d-b782-04f2c573862a" (UID: "ed2587cd-e48f-400d-b782-04f2c573862a"). InnerVolumeSpecName "kube-api-access-x4txs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.947496 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4txs\" (UniqueName: \"kubernetes.io/projected/ed2587cd-e48f-400d-b782-04f2c573862a-kube-api-access-x4txs\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:11 crc kubenswrapper[5004]: I1203 14:28:11.995339 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150594 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g97f8\" (UniqueName: \"kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150644 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150713 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150730 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150775 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150840 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.150906 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle\") pod \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\" (UID: \"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.152240 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.152421 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.156058 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8" (OuterVolumeSpecName: "kube-api-access-g97f8") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "kube-api-access-g97f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.162053 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts" (OuterVolumeSpecName: "scripts") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.182561 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.192710 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gl6js" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.192698 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gl6js" event={"ID":"ed2587cd-e48f-400d-b782-04f2c573862a","Type":"ContainerDied","Data":"671a9e5029acb95362e869cf2abadc43bf1b1f105c58bd32cd776b6fa039686a"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.193107 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671a9e5029acb95362e869cf2abadc43bf1b1f105c58bd32cd776b6fa039686a" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196377 5004 generic.go:334] "Generic (PLEG): container finished" podID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" exitCode=0 Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196412 5004 generic.go:334] "Generic (PLEG): container finished" podID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" exitCode=2 Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196424 5004 generic.go:334] "Generic (PLEG): container finished" podID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" exitCode=0 Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196432 5004 generic.go:334] "Generic (PLEG): container finished" podID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" exitCode=0 Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196438 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196470 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerDied","Data":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196494 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerDied","Data":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerDied","Data":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196530 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerDied","Data":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a","Type":"ContainerDied","Data":"e440a762a9669503c0ee9e795f62dcbab1e6a01759e3a37f14a51a759727cdd4"} Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.196560 5004 scope.go:117] "RemoveContainer" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.242405 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253123 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g97f8\" (UniqueName: \"kubernetes.io/projected/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-kube-api-access-g97f8\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253174 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253183 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253193 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253203 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.253215 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.257791 5004 scope.go:117] "RemoveContainer" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 
14:28:12.282060 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data" (OuterVolumeSpecName: "config-data") pod "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" (UID: "c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.292818 5004 scope.go:117] "RemoveContainer" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.354939 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.388007 5004 scope.go:117] "RemoveContainer" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.412921 5004 scope.go:117] "RemoveContainer" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.413908 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": container with ID starting with 1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578 not found: ID does not exist" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.413969 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} err="failed to get container status \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": rpc error: code = 
NotFound desc = could not find container \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": container with ID starting with 1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.414002 5004 scope.go:117] "RemoveContainer" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.414482 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": container with ID starting with eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32 not found: ID does not exist" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.414534 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} err="failed to get container status \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": rpc error: code = NotFound desc = could not find container \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": container with ID starting with eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.414560 5004 scope.go:117] "RemoveContainer" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.414931 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": container with ID starting with 
6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26 not found: ID does not exist" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.414974 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} err="failed to get container status \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": rpc error: code = NotFound desc = could not find container \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": container with ID starting with 6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415001 5004 scope.go:117] "RemoveContainer" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.415239 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": container with ID starting with 3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de not found: ID does not exist" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415261 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} err="failed to get container status \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": rpc error: code = NotFound desc = could not find container \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": container with ID starting with 3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de not found: ID does not 
exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415275 5004 scope.go:117] "RemoveContainer" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415607 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} err="failed to get container status \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": rpc error: code = NotFound desc = could not find container \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": container with ID starting with 1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415628 5004 scope.go:117] "RemoveContainer" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415880 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} err="failed to get container status \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": rpc error: code = NotFound desc = could not find container \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": container with ID starting with eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.415896 5004 scope.go:117] "RemoveContainer" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.416092 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} err="failed to get container status 
\"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": rpc error: code = NotFound desc = could not find container \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": container with ID starting with 6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.416107 5004 scope.go:117] "RemoveContainer" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.416785 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} err="failed to get container status \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": rpc error: code = NotFound desc = could not find container \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": container with ID starting with 3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.416805 5004 scope.go:117] "RemoveContainer" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417242 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} err="failed to get container status \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": rpc error: code = NotFound desc = could not find container \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": container with ID starting with 1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417277 5004 scope.go:117] "RemoveContainer" 
containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417486 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} err="failed to get container status \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": rpc error: code = NotFound desc = could not find container \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": container with ID starting with eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417508 5004 scope.go:117] "RemoveContainer" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417705 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} err="failed to get container status \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": rpc error: code = NotFound desc = could not find container \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": container with ID starting with 6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.417731 5004 scope.go:117] "RemoveContainer" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.418008 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} err="failed to get container status \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": rpc error: code = NotFound desc = could 
not find container \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": container with ID starting with 3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.418024 5004 scope.go:117] "RemoveContainer" containerID="1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.418957 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578"} err="failed to get container status \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": rpc error: code = NotFound desc = could not find container \"1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578\": container with ID starting with 1f0b784e3218e26dfd8c939ded2c4988e5851c26761b563701b26ac8143ff578 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.418977 5004 scope.go:117] "RemoveContainer" containerID="eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.419262 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32"} err="failed to get container status \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": rpc error: code = NotFound desc = could not find container \"eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32\": container with ID starting with eccd15d9ac8ef5e35a32df7795cdbc50fb3f46032004c00153b07e60c0a07b32 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.419281 5004 scope.go:117] "RemoveContainer" containerID="6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 
14:28:12.419573 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26"} err="failed to get container status \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": rpc error: code = NotFound desc = could not find container \"6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26\": container with ID starting with 6ef6d02f58ca448f7a3d4e6dcfb6e53bb2315017e163354c4061e9510934cf26 not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.419614 5004 scope.go:117] "RemoveContainer" containerID="3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.419923 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de"} err="failed to get container status \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": rpc error: code = NotFound desc = could not find container \"3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de\": container with ID starting with 3363d0f35f19a982e176bd5531f556e4fb783cb96694a945946639e70c14b0de not found: ID does not exist" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.531144 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.538709 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.559886 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.563811 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="proxy-httpd" Dec 03 14:28:12 crc 
kubenswrapper[5004]: I1203 14:28:12.563949 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="proxy-httpd" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.563965 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="sg-core" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.563973 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="sg-core" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.563984 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-notification-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.563992 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-notification-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.564018 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2587cd-e48f-400d-b782-04f2c573862a" containerName="mariadb-database-create" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564026 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2587cd-e48f-400d-b782-04f2c573862a" containerName="mariadb-database-create" Dec 03 14:28:12 crc kubenswrapper[5004]: E1203 14:28:12.564049 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-central-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564056 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-central-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564315 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" 
containerName="ceilometer-notification-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564327 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2587cd-e48f-400d-b782-04f2c573862a" containerName="mariadb-database-create" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564339 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="ceilometer-central-agent" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564354 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="proxy-httpd" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.564362 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" containerName="sg-core" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.568038 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.576305 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.576636 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.578789 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.700614 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766224 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766290 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766364 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd257\" (UniqueName: \"kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 
14:28:12.766383 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.766401 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.816052 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.830926 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.846682 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4b76-account-create-update-v5798" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.867402 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.869622 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzhr\" (UniqueName: \"kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr\") pod \"57216442-7799-4751-8116-ba7d842d4be9\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.869744 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts\") pod \"57216442-7799-4751-8116-ba7d842d4be9\" (UID: \"57216442-7799-4751-8116-ba7d842d4be9\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870096 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870142 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870167 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870236 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xd257\" (UniqueName: \"kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870257 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870405 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.870871 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.876312 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57216442-7799-4751-8116-ba7d842d4be9" (UID: "57216442-7799-4751-8116-ba7d842d4be9"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.876333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.877827 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.878101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.879472 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.881988 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr" (OuterVolumeSpecName: "kube-api-access-gzzhr") pod "57216442-7799-4751-8116-ba7d842d4be9" (UID: "57216442-7799-4751-8116-ba7d842d4be9"). InnerVolumeSpecName "kube-api-access-gzzhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.885205 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.912891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd257\" (UniqueName: \"kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257\") pod \"ceilometer-0\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") " pod="openstack/ceilometer-0" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973442 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hznlh\" (UniqueName: \"kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh\") pod \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973535 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts\") pod \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\" (UID: \"9b84bb98-6817-47a4-8d88-dfeb6a19e195\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973650 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts\") pod \"16e053df-7662-47f4-bd6d-ed3f75ca1901\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973684 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts\") pod \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973712 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shh5\" (UniqueName: \"kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5\") pod \"16e053df-7662-47f4-bd6d-ed3f75ca1901\" (UID: \"16e053df-7662-47f4-bd6d-ed3f75ca1901\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973807 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmg7j\" (UniqueName: \"kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j\") pod \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\" (UID: \"9df3e0e7-8b8b-4f47-8b19-40afbab582d6\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973830 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts\") pod \"804a1196-3d3c-4e15-8a2c-3ae12a943249\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.973847 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tczp6\" (UniqueName: \"kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6\") pod \"804a1196-3d3c-4e15-8a2c-3ae12a943249\" (UID: \"804a1196-3d3c-4e15-8a2c-3ae12a943249\") " Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.974316 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzhr\" (UniqueName: \"kubernetes.io/projected/57216442-7799-4751-8116-ba7d842d4be9-kube-api-access-gzzhr\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: 
I1203 14:28:12.974335 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57216442-7799-4751-8116-ba7d842d4be9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.974694 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b84bb98-6817-47a4-8d88-dfeb6a19e195" (UID: "9b84bb98-6817-47a4-8d88-dfeb6a19e195"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.975399 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9df3e0e7-8b8b-4f47-8b19-40afbab582d6" (UID: "9df3e0e7-8b8b-4f47-8b19-40afbab582d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.975825 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16e053df-7662-47f4-bd6d-ed3f75ca1901" (UID: "16e053df-7662-47f4-bd6d-ed3f75ca1901"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.977293 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "804a1196-3d3c-4e15-8a2c-3ae12a943249" (UID: "804a1196-3d3c-4e15-8a2c-3ae12a943249"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.990103 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6" (OuterVolumeSpecName: "kube-api-access-tczp6") pod "804a1196-3d3c-4e15-8a2c-3ae12a943249" (UID: "804a1196-3d3c-4e15-8a2c-3ae12a943249"). InnerVolumeSpecName "kube-api-access-tczp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.995222 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5" (OuterVolumeSpecName: "kube-api-access-8shh5") pod "16e053df-7662-47f4-bd6d-ed3f75ca1901" (UID: "16e053df-7662-47f4-bd6d-ed3f75ca1901"). InnerVolumeSpecName "kube-api-access-8shh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:12 crc kubenswrapper[5004]: I1203 14:28:12.995326 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh" (OuterVolumeSpecName: "kube-api-access-hznlh") pod "9b84bb98-6817-47a4-8d88-dfeb6a19e195" (UID: "9b84bb98-6817-47a4-8d88-dfeb6a19e195"). InnerVolumeSpecName "kube-api-access-hznlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.004321 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j" (OuterVolumeSpecName: "kube-api-access-vmg7j") pod "9df3e0e7-8b8b-4f47-8b19-40afbab582d6" (UID: "9df3e0e7-8b8b-4f47-8b19-40afbab582d6"). InnerVolumeSpecName "kube-api-access-vmg7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075908 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/804a1196-3d3c-4e15-8a2c-3ae12a943249-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075948 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tczp6\" (UniqueName: \"kubernetes.io/projected/804a1196-3d3c-4e15-8a2c-3ae12a943249-kube-api-access-tczp6\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075963 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hznlh\" (UniqueName: \"kubernetes.io/projected/9b84bb98-6817-47a4-8d88-dfeb6a19e195-kube-api-access-hznlh\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075975 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b84bb98-6817-47a4-8d88-dfeb6a19e195-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075987 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16e053df-7662-47f4-bd6d-ed3f75ca1901-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.075998 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.076009 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shh5\" (UniqueName: \"kubernetes.io/projected/16e053df-7662-47f4-bd6d-ed3f75ca1901-kube-api-access-8shh5\") on node \"crc\" DevicePath \"\"" Dec 03 
14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.076017 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmg7j\" (UniqueName: \"kubernetes.io/projected/9df3e0e7-8b8b-4f47-8b19-40afbab582d6-kube-api-access-vmg7j\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.190549 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.208532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a3f9-account-create-update-j96b4" event={"ID":"57216442-7799-4751-8116-ba7d842d4be9","Type":"ContainerDied","Data":"6c0fd9692d30ae16f6f5cc79d4507dec5fc745043bb4e0a86022ad66c9b09950"} Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.208578 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0fd9692d30ae16f6f5cc79d4507dec5fc745043bb4e0a86022ad66c9b09950" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.208660 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a3f9-account-create-update-j96b4" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.210791 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-g7df6" event={"ID":"9b84bb98-6817-47a4-8d88-dfeb6a19e195","Type":"ContainerDied","Data":"654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba"} Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.210924 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654df10ab14e2832a7d987f57522ec5e47a82b55daad7a2d7cef401743fb3dba" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.210963 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-g7df6" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.216580 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pwhkz" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.216575 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pwhkz" event={"ID":"9df3e0e7-8b8b-4f47-8b19-40afbab582d6","Type":"ContainerDied","Data":"ac7a645c33377068ddf6ce4f97762656dd2506634224887fa2f212e4406b37a7"} Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.216765 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7a645c33377068ddf6ce4f97762656dd2506634224887fa2f212e4406b37a7" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.218310 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4b76-account-create-update-v5798" event={"ID":"804a1196-3d3c-4e15-8a2c-3ae12a943249","Type":"ContainerDied","Data":"aa776eca12008b76487d573ab7ced14c65f69dad76e53b8de8b5f2b0da3cdff5"} Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.218362 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa776eca12008b76487d573ab7ced14c65f69dad76e53b8de8b5f2b0da3cdff5" Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.218422 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4b76-account-create-update-v5798"
Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.226264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6611-account-create-update-vs8jg" event={"ID":"16e053df-7662-47f4-bd6d-ed3f75ca1901","Type":"ContainerDied","Data":"7bb0c69dd70915fcc2fd4cabb9cb9f78a6c155d9d9b487725860c5d19ee1c71d"}
Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.226965 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb0c69dd70915fcc2fd4cabb9cb9f78a6c155d9d9b487725860c5d19ee1c71d"
Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.227185 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6611-account-create-update-vs8jg"
Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.626075 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a" path="/var/lib/kubelet/pods/c5fb8a3c-4989-4d2c-acc9-50f0da3bc47a/volumes"
Dec 03 14:28:13 crc kubenswrapper[5004]: W1203 14:28:13.659068 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb4fe63a_54e2_4407_aaa5_45b394d0d460.slice/crio-afef6a8786cd175846eb520b7dba9b20ee3cf6234e1b598cbc9bd92724cc8e1a WatchSource:0}: Error finding container afef6a8786cd175846eb520b7dba9b20ee3cf6234e1b598cbc9bd92724cc8e1a: Status 404 returned error can't find the container with id afef6a8786cd175846eb520b7dba9b20ee3cf6234e1b598cbc9bd92724cc8e1a
Dec 03 14:28:13 crc kubenswrapper[5004]: I1203 14:28:13.659146 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.247728 5004 generic.go:334] "Generic (PLEG): container finished" podID="962799d4-1cef-40f7-a1d8-e4231680a856" containerID="170abb84be82d97c90bb1e36346b7122190618e65add8272765ea09c84996d53" exitCode=137
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.247922 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerDied","Data":"170abb84be82d97c90bb1e36346b7122190618e65add8272765ea09c84996d53"}
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.249601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerStarted","Data":"afef6a8786cd175846eb520b7dba9b20ee3cf6234e1b598cbc9bd92724cc8e1a"}
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.459022 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.459065 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.503897 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.511585 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587bd47d68-c6stc"
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.512495 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.605607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606128 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606182 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22lzn\" (UniqueName: \"kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606219 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606307 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.606390 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs\") pod \"962799d4-1cef-40f7-a1d8-e4231680a856\" (UID: \"962799d4-1cef-40f7-a1d8-e4231680a856\") "
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.607587 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs" (OuterVolumeSpecName: "logs") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.621072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn" (OuterVolumeSpecName: "kube-api-access-22lzn") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "kube-api-access-22lzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.648421 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.649955 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data" (OuterVolumeSpecName: "config-data") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.657538 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts" (OuterVolumeSpecName: "scripts") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.662926 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.680375 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "962799d4-1cef-40f7-a1d8-e4231680a856" (UID: "962799d4-1cef-40f7-a1d8-e4231680a856"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.708931 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22lzn\" (UniqueName: \"kubernetes.io/projected/962799d4-1cef-40f7-a1d8-e4231680a856-kube-api-access-22lzn\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.708962 5004 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.708990 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.709003 5004 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.709012 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962799d4-1cef-40f7-a1d8-e4231680a856-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.709020 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962799d4-1cef-40f7-a1d8-e4231680a856-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:14 crc kubenswrapper[5004]: I1203 14:28:14.709027 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/962799d4-1cef-40f7-a1d8-e4231680a856-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.261687 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerStarted","Data":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"}
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.263045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerStarted","Data":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"}
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.264605 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-587bd47d68-c6stc" event={"ID":"962799d4-1cef-40f7-a1d8-e4231680a856","Type":"ContainerDied","Data":"c8f8d88fd60e9feec438572e394893a90bc99976e8ad2b5a4c5ac9a1d6c15b74"}
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.264650 5004 scope.go:117] "RemoveContainer" containerID="85251560e3d53350eb3d01e94ef00b8672f076f3cfd3f4a9681e204313af876e"
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.264656 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-587bd47d68-c6stc"
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.264840 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.264899 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.317047 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-587bd47d68-c6stc"]
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.333801 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-587bd47d68-c6stc"]
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.477807 5004 scope.go:117] "RemoveContainer" containerID="170abb84be82d97c90bb1e36346b7122190618e65add8272765ea09c84996d53"
Dec 03 14:28:15 crc kubenswrapper[5004]: I1203 14:28:15.625014 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" path="/var/lib/kubelet/pods/962799d4-1cef-40f7-a1d8-e4231680a856/volumes"
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.277105 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerStarted","Data":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"}
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.583884 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.587011 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.587287 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.660774 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:16 crc kubenswrapper[5004]: I1203 14:28:16.674361 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:17 crc kubenswrapper[5004]: I1203 14:28:17.291020 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:17 crc kubenswrapper[5004]: I1203 14:28:17.291524 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 03 14:28:17 crc kubenswrapper[5004]: I1203 14:28:17.763938 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:17 crc kubenswrapper[5004]: I1203 14:28:17.764057 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 14:28:17 crc kubenswrapper[5004]: I1203 14:28:17.803906 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.302446 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-central-agent" containerID="cri-o://ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" gracePeriod=30
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.302804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerStarted","Data":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"}
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.303416 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="sg-core" containerID="cri-o://d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685" gracePeriod=30
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.303508 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="proxy-httpd" containerID="cri-o://5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28" gracePeriod=30
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.303571 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-notification-agent" containerID="cri-o://6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5" gracePeriod=30
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.303694 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 14:28:18 crc kubenswrapper[5004]: I1203 14:28:18.324989 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.874860608 podStartE2EDuration="6.324970602s" podCreationTimestamp="2025-12-03 14:28:12 +0000 UTC" firstStartedPulling="2025-12-03 14:28:13.661672701 +0000 UTC m=+1306.410642937" lastFinishedPulling="2025-12-03 14:28:17.111782695 +0000 UTC m=+1309.860752931" observedRunningTime="2025-12-03 14:28:18.323563322 +0000 UTC m=+1311.072533578" watchObservedRunningTime="2025-12-03 14:28:18.324970602 +0000 UTC m=+1311.073940858"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.558450 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9twz"]
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559218 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df3e0e7-8b8b-4f47-8b19-40afbab582d6" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559237 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df3e0e7-8b8b-4f47-8b19-40afbab582d6" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559251 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon-log"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559260 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon-log"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559282 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84bb98-6817-47a4-8d88-dfeb6a19e195" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559292 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84bb98-6817-47a4-8d88-dfeb6a19e195" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559325 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e053df-7662-47f4-bd6d-ed3f75ca1901" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559333 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e053df-7662-47f4-bd6d-ed3f75ca1901" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559347 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a1196-3d3c-4e15-8a2c-3ae12a943249" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559356 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a1196-3d3c-4e15-8a2c-3ae12a943249" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559368 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57216442-7799-4751-8116-ba7d842d4be9" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559376 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="57216442-7799-4751-8116-ba7d842d4be9" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: E1203 14:28:19.559388 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559396 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559600 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df3e0e7-8b8b-4f47-8b19-40afbab582d6" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559616 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559633 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="962799d4-1cef-40f7-a1d8-e4231680a856" containerName="horizon-log"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559651 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84bb98-6817-47a4-8d88-dfeb6a19e195" containerName="mariadb-database-create"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559667 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e053df-7662-47f4-bd6d-ed3f75ca1901" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559683 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="804a1196-3d3c-4e15-8a2c-3ae12a943249" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.559694 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="57216442-7799-4751-8116-ba7d842d4be9" containerName="mariadb-account-create-update"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.560467 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.563731 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.564123 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kr7qd"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.564328 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.568404 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9twz"]
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.624723 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.624797 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.624899 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjd6g\" (UniqueName: \"kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.625240 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.726969 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.727070 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.727102 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.727207 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjd6g\" (UniqueName: \"kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.733470 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.737432 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.745534 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.747101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjd6g\" (UniqueName: \"kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g\") pod \"nova-cell0-conductor-db-sync-t9twz\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") " pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:19 crc kubenswrapper[5004]: I1203 14:28:19.879324 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.322478 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332087 5004 generic.go:334] "Generic (PLEG): container finished" podID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28" exitCode=0
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332145 5004 generic.go:334] "Generic (PLEG): container finished" podID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685" exitCode=2
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332157 5004 generic.go:334] "Generic (PLEG): container finished" podID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5" exitCode=0
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332204 5004 generic.go:334] "Generic (PLEG): container finished" podID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" exitCode=0
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332233 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerDied","Data":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"}
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332288 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerDied","Data":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"}
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332304 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerDied","Data":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"}
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332315 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerDied","Data":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"}
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332327 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb4fe63a-54e2-4407-aaa5-45b394d0d460","Type":"ContainerDied","Data":"afef6a8786cd175846eb520b7dba9b20ee3cf6234e1b598cbc9bd92724cc8e1a"}
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.332362 5004 scope.go:117] "RemoveContainer" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.392799 5004 scope.go:117] "RemoveContainer" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.422128 5004 scope.go:117] "RemoveContainer" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441128 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441234 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441271 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441293 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441395 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd257\" (UniqueName: \"kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441465 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.441522 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle\") pod \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\" (UID: \"cb4fe63a-54e2-4407-aaa5-45b394d0d460\") "
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.444125 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.444216 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.452245 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts" (OuterVolumeSpecName: "scripts") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.456102 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257" (OuterVolumeSpecName: "kube-api-access-xd257") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "kube-api-access-xd257". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.476163 5004 scope.go:117] "RemoveContainer" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.494216 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.513410 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9twz"]
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.516332 5004 scope.go:117] "RemoveContainer" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"
Dec 03 14:28:20 crc kubenswrapper[5004]: E1203 14:28:20.516985 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": container with ID starting with 5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28 not found: ID does not exist" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.517008 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"} err="failed to get container status \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": rpc error: code = NotFound desc = could not find container \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": container with ID starting with 5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28 not found: ID does not exist"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.517049 5004 scope.go:117] "RemoveContainer" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"
Dec 03 14:28:20 crc kubenswrapper[5004]: E1203 14:28:20.517890 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": container with ID starting with d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685 not found: ID does not exist" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.517913 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"} err="failed to get container status \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": rpc error: code = NotFound desc = could not find container \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": container with ID starting with d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685 not found: ID does not exist"
Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.517928 5004 scope.go:117] "RemoveContainer" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"
Dec 03 14:28:20 crc kubenswrapper[5004]: E1203 14:28:20.518983 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": container with ID starting with 6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5 not found: ID does not exist" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"
Dec 03 14:28:20 crc
kubenswrapper[5004]: I1203 14:28:20.519084 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"} err="failed to get container status \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": rpc error: code = NotFound desc = could not find container \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": container with ID starting with 6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.519101 5004 scope.go:117] "RemoveContainer" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" Dec 03 14:28:20 crc kubenswrapper[5004]: E1203 14:28:20.521094 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": container with ID starting with ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed not found: ID does not exist" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521135 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"} err="failed to get container status \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": rpc error: code = NotFound desc = could not find container \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": container with ID starting with ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521159 5004 scope.go:117] "RemoveContainer" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28" Dec 03 
14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521657 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"} err="failed to get container status \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": rpc error: code = NotFound desc = could not find container \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": container with ID starting with 5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521683 5004 scope.go:117] "RemoveContainer" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521928 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"} err="failed to get container status \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": rpc error: code = NotFound desc = could not find container \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": container with ID starting with d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.521952 5004 scope.go:117] "RemoveContainer" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522154 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"} err="failed to get container status \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": rpc error: code = NotFound desc = could not find container 
\"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": container with ID starting with 6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522176 5004 scope.go:117] "RemoveContainer" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522375 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"} err="failed to get container status \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": rpc error: code = NotFound desc = could not find container \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": container with ID starting with ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522393 5004 scope.go:117] "RemoveContainer" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522614 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"} err="failed to get container status \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": rpc error: code = NotFound desc = could not find container \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": container with ID starting with 5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522640 5004 scope.go:117] "RemoveContainer" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522828 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"} err="failed to get container status \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": rpc error: code = NotFound desc = could not find container \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": container with ID starting with d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.522926 5004 scope.go:117] "RemoveContainer" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.523463 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"} err="failed to get container status \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": rpc error: code = NotFound desc = could not find container \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": container with ID starting with 6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.523553 5004 scope.go:117] "RemoveContainer" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.530888 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"} err="failed to get container status \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": rpc error: code = NotFound desc = could not find container \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": container with ID starting with 
ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.530934 5004 scope.go:117] "RemoveContainer" containerID="5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.531605 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28"} err="failed to get container status \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": rpc error: code = NotFound desc = could not find container \"5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28\": container with ID starting with 5219c721fbb78c55da8c682a7aa491ce57dd5bbb39ee7ba62833e30bfac47d28 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.531635 5004 scope.go:117] "RemoveContainer" containerID="d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.531828 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685"} err="failed to get container status \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": rpc error: code = NotFound desc = could not find container \"d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685\": container with ID starting with d1a3f955b5e7c8116449d61de69f94e9882be353a8b990cf8e087a84b1eaf685 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.531852 5004 scope.go:117] "RemoveContainer" containerID="6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.532100 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5"} err="failed to get container status \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": rpc error: code = NotFound desc = could not find container \"6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5\": container with ID starting with 6647c98dc422063488cbb60166b4a0acf6bd760c7c7fde4bb63fabf47523a0a5 not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.532126 5004 scope.go:117] "RemoveContainer" containerID="ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.533924 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed"} err="failed to get container status \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": rpc error: code = NotFound desc = could not find container \"ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed\": container with ID starting with ebf54380bbf73d405536dfdb5c4e7582d357b1fac5222ff05b5cabce30f194ed not found: ID does not exist" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.545882 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd257\" (UniqueName: \"kubernetes.io/projected/cb4fe63a-54e2-4407-aaa5-45b394d0d460-kube-api-access-xd257\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.546920 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.546935 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.546945 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb4fe63a-54e2-4407-aaa5-45b394d0d460-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.546962 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.585481 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.616023 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data" (OuterVolumeSpecName: "config-data") pod "cb4fe63a-54e2-4407-aaa5-45b394d0d460" (UID: "cb4fe63a-54e2-4407-aaa5-45b394d0d460"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.648542 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.648817 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fe63a-54e2-4407-aaa5-45b394d0d460-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.904031 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.904128 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 14:28:20 crc kubenswrapper[5004]: I1203 14:28:20.937650 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.350229 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9twz" event={"ID":"2193ca31-82a6-4591-ae77-79ffa853b938","Type":"ContainerStarted","Data":"4c687b8c87798ebee30aa1b08d7c286b22ba2e33af7c18277f8ab76fa21d402b"} Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.352583 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.398674 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.412595 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.446237 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:21 crc kubenswrapper[5004]: E1203 14:28:21.447446 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="proxy-httpd" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.447473 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="proxy-httpd" Dec 03 14:28:21 crc kubenswrapper[5004]: E1203 14:28:21.447541 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-central-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.447551 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-central-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: E1203 14:28:21.447582 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-notification-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.447591 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-notification-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: E1203 14:28:21.447622 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="sg-core" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.447630 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="sg-core" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.448135 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="proxy-httpd" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.448172 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-central-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.448217 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="sg-core" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.448248 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" containerName="ceilometer-notification-agent" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.453826 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.459031 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.460066 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.506722 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.571730 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.571781 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.572049 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.572078 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " 
pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.572099 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.572153 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.572209 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxvn\" (UniqueName: \"kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.627362 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4fe63a-54e2-4407-aaa5-45b394d0d460" path="/var/lib/kubelet/pods/cb4fe63a-54e2-4407-aaa5-45b394d0d460/volumes" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.674846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.675065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.675188 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.675214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.675252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.677424 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.678213 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.675335 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.678770 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxvn\" (UniqueName: \"kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.697576 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.700122 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.701618 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0" Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.703380 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " 
pod="openstack/ceilometer-0"
Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.708642 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxvn\" (UniqueName: \"kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn\") pod \"ceilometer-0\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") " pod="openstack/ceilometer-0"
Dec 03 14:28:21 crc kubenswrapper[5004]: I1203 14:28:21.807804 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:28:22 crc kubenswrapper[5004]: W1203 14:28:22.592376 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde97d15b_33e6_422f_b094_33bdad1d7f87.slice/crio-c97a669c94e0533562e9d6d60417ad1f60aa621e868d7ea94d8984021196b4d1 WatchSource:0}: Error finding container c97a669c94e0533562e9d6d60417ad1f60aa621e868d7ea94d8984021196b4d1: Status 404 returned error can't find the container with id c97a669c94e0533562e9d6d60417ad1f60aa621e868d7ea94d8984021196b4d1
Dec 03 14:28:22 crc kubenswrapper[5004]: I1203 14:28:22.594285 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:28:23 crc kubenswrapper[5004]: I1203 14:28:23.376371 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerStarted","Data":"c97a669c94e0533562e9d6d60417ad1f60aa621e868d7ea94d8984021196b4d1"}
Dec 03 14:28:24 crc kubenswrapper[5004]: I1203 14:28:24.223695 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:28:24 crc kubenswrapper[5004]: I1203 14:28:24.391266 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerStarted","Data":"a9e6a64f9a6abe366139e79765596d46f53c83b150094f475913aa7aeb768a34"}
Dec 03 14:28:35 crc kubenswrapper[5004]: I1203 14:28:35.518907 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerStarted","Data":"dd12a274d7c3255220428cfe3d02e7243e892a03ab39db597754ac695b1d13e0"}
Dec 03 14:28:35 crc kubenswrapper[5004]: I1203 14:28:35.521996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9twz" event={"ID":"2193ca31-82a6-4591-ae77-79ffa853b938","Type":"ContainerStarted","Data":"6e4f5838998300c39c89d703be54ae1e9136eab8425c56c67f5fca7d96729f25"}
Dec 03 14:28:35 crc kubenswrapper[5004]: I1203 14:28:35.542136 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-t9twz" podStartSLOduration=3.899959002 podStartE2EDuration="16.542113656s" podCreationTimestamp="2025-12-03 14:28:19 +0000 UTC" firstStartedPulling="2025-12-03 14:28:20.533935207 +0000 UTC m=+1313.282905443" lastFinishedPulling="2025-12-03 14:28:33.176089861 +0000 UTC m=+1325.925060097" observedRunningTime="2025-12-03 14:28:35.541034386 +0000 UTC m=+1328.290004642" watchObservedRunningTime="2025-12-03 14:28:35.542113656 +0000 UTC m=+1328.291083932"
Dec 03 14:28:36 crc kubenswrapper[5004]: I1203 14:28:36.532149 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerStarted","Data":"043a9c8b996743b445b0a9af6af889f3a5c3e2836e18171fffb0b7d7e2d9bf9b"}
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.579999 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerStarted","Data":"236d4caeb3dfbc2679503a12bfb71e07d8dcf714d41d8e395d8f8ba423c4e12d"}
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.580602 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.580232 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="sg-core" containerID="cri-o://043a9c8b996743b445b0a9af6af889f3a5c3e2836e18171fffb0b7d7e2d9bf9b" gracePeriod=30
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.580249 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-notification-agent" containerID="cri-o://dd12a274d7c3255220428cfe3d02e7243e892a03ab39db597754ac695b1d13e0" gracePeriod=30
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.580570 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-central-agent" containerID="cri-o://a9e6a64f9a6abe366139e79765596d46f53c83b150094f475913aa7aeb768a34" gracePeriod=30
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.580197 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="proxy-httpd" containerID="cri-o://236d4caeb3dfbc2679503a12bfb71e07d8dcf714d41d8e395d8f8ba423c4e12d" gracePeriod=30
Dec 03 14:28:38 crc kubenswrapper[5004]: I1203 14:28:38.608358 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.993921343 podStartE2EDuration="17.608333976s" podCreationTimestamp="2025-12-03 14:28:21 +0000 UTC" firstStartedPulling="2025-12-03 14:28:22.596732251 +0000 UTC m=+1315.345702487" lastFinishedPulling="2025-12-03 14:28:38.211144884 +0000 UTC m=+1330.960115120" observedRunningTime="2025-12-03 14:28:38.600459543 +0000 UTC m=+1331.349429779" watchObservedRunningTime="2025-12-03 14:28:38.608333976 +0000 UTC m=+1331.357304212"
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594132 5004 generic.go:334] "Generic (PLEG): container finished" podID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerID="043a9c8b996743b445b0a9af6af889f3a5c3e2836e18171fffb0b7d7e2d9bf9b" exitCode=2
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594166 5004 generic.go:334] "Generic (PLEG): container finished" podID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerID="dd12a274d7c3255220428cfe3d02e7243e892a03ab39db597754ac695b1d13e0" exitCode=0
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594176 5004 generic.go:334] "Generic (PLEG): container finished" podID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerID="a9e6a64f9a6abe366139e79765596d46f53c83b150094f475913aa7aeb768a34" exitCode=0
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594226 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerDied","Data":"043a9c8b996743b445b0a9af6af889f3a5c3e2836e18171fffb0b7d7e2d9bf9b"}
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594305 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerDied","Data":"dd12a274d7c3255220428cfe3d02e7243e892a03ab39db597754ac695b1d13e0"}
Dec 03 14:28:39 crc kubenswrapper[5004]: I1203 14:28:39.594318 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerDied","Data":"a9e6a64f9a6abe366139e79765596d46f53c83b150094f475913aa7aeb768a34"}
Dec 03 14:28:51 crc kubenswrapper[5004]: I1203 14:28:51.814199 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 14:28:55 crc kubenswrapper[5004]: I1203 14:28:55.746756 5004 generic.go:334] "Generic (PLEG): container finished" podID="2193ca31-82a6-4591-ae77-79ffa853b938" containerID="6e4f5838998300c39c89d703be54ae1e9136eab8425c56c67f5fca7d96729f25" exitCode=0
Dec 03 14:28:55 crc kubenswrapper[5004]: I1203 14:28:55.746883 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9twz" event={"ID":"2193ca31-82a6-4591-ae77-79ffa853b938","Type":"ContainerDied","Data":"6e4f5838998300c39c89d703be54ae1e9136eab8425c56c67f5fca7d96729f25"}
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.152352 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.210503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data\") pod \"2193ca31-82a6-4591-ae77-79ffa853b938\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") "
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.210727 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts\") pod \"2193ca31-82a6-4591-ae77-79ffa853b938\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") "
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.210792 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjd6g\" (UniqueName: \"kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g\") pod \"2193ca31-82a6-4591-ae77-79ffa853b938\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") "
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.210829 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle\") pod \"2193ca31-82a6-4591-ae77-79ffa853b938\" (UID: \"2193ca31-82a6-4591-ae77-79ffa853b938\") "
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.216934 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts" (OuterVolumeSpecName: "scripts") pod "2193ca31-82a6-4591-ae77-79ffa853b938" (UID: "2193ca31-82a6-4591-ae77-79ffa853b938"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.216942 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g" (OuterVolumeSpecName: "kube-api-access-cjd6g") pod "2193ca31-82a6-4591-ae77-79ffa853b938" (UID: "2193ca31-82a6-4591-ae77-79ffa853b938"). InnerVolumeSpecName "kube-api-access-cjd6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.239120 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2193ca31-82a6-4591-ae77-79ffa853b938" (UID: "2193ca31-82a6-4591-ae77-79ffa853b938"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.242119 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data" (OuterVolumeSpecName: "config-data") pod "2193ca31-82a6-4591-ae77-79ffa853b938" (UID: "2193ca31-82a6-4591-ae77-79ffa853b938"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.312806 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.312839 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.312850 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjd6g\" (UniqueName: \"kubernetes.io/projected/2193ca31-82a6-4591-ae77-79ffa853b938-kube-api-access-cjd6g\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.312887 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2193ca31-82a6-4591-ae77-79ffa853b938-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:28:57 crc kubenswrapper[5004]: E1203 14:28:57.771814 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2193ca31_82a6_4591_ae77_79ffa853b938.slice/crio-4c687b8c87798ebee30aa1b08d7c286b22ba2e33af7c18277f8ab76fa21d402b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2193ca31_82a6_4591_ae77_79ffa853b938.slice\": RecentStats: unable to find data in memory cache]"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.780192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9twz" event={"ID":"2193ca31-82a6-4591-ae77-79ffa853b938","Type":"ContainerDied","Data":"4c687b8c87798ebee30aa1b08d7c286b22ba2e33af7c18277f8ab76fa21d402b"}
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.780246 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c687b8c87798ebee30aa1b08d7c286b22ba2e33af7c18277f8ab76fa21d402b"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.780322 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9twz"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.903039 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 14:28:57 crc kubenswrapper[5004]: E1203 14:28:57.903685 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2193ca31-82a6-4591-ae77-79ffa853b938" containerName="nova-cell0-conductor-db-sync"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.903698 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2193ca31-82a6-4591-ae77-79ffa853b938" containerName="nova-cell0-conductor-db-sync"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.903911 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2193ca31-82a6-4591-ae77-79ffa853b938" containerName="nova-cell0-conductor-db-sync"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.904596 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.907804 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kr7qd"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.908236 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 14:28:57 crc kubenswrapper[5004]: I1203 14:28:57.917599 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.036831 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.036943 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.036986 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2sp\" (UniqueName: \"kubernetes.io/projected/d73689af-1cf5-4846-8e52-c34bce039ca7-kube-api-access-pz2sp\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.138971 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.139042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.139091 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2sp\" (UniqueName: \"kubernetes.io/projected/d73689af-1cf5-4846-8e52-c34bce039ca7-kube-api-access-pz2sp\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.143728 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.146962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73689af-1cf5-4846-8e52-c34bce039ca7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.167723 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2sp\" (UniqueName: \"kubernetes.io/projected/d73689af-1cf5-4846-8e52-c34bce039ca7-kube-api-access-pz2sp\") pod \"nova-cell0-conductor-0\" (UID: \"d73689af-1cf5-4846-8e52-c34bce039ca7\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.236464 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.712837 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 14:28:58 crc kubenswrapper[5004]: I1203 14:28:58.805012 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d73689af-1cf5-4846-8e52-c34bce039ca7","Type":"ContainerStarted","Data":"13d7b8c6172c228a7ab43b4eb60772996354f1037ec9d541ace8f03b645bba98"}
Dec 03 14:28:59 crc kubenswrapper[5004]: I1203 14:28:59.826374 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d73689af-1cf5-4846-8e52-c34bce039ca7","Type":"ContainerStarted","Data":"d1647934ab60b6eb2b367cf8f579f1241db49ccf2bc5f2e028e9263f23da68d7"}
Dec 03 14:28:59 crc kubenswrapper[5004]: I1203 14:28:59.826978 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 03 14:28:59 crc kubenswrapper[5004]: I1203 14:28:59.850275 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.850257643 podStartE2EDuration="2.850257643s" podCreationTimestamp="2025-12-03 14:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:28:59.847349611 +0000 UTC m=+1352.596319847" watchObservedRunningTime="2025-12-03 14:28:59.850257643 +0000 UTC m=+1352.599227879"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.270851 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.721617 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-79vfv"]
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.723167 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.727691 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.727939 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.738345 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-79vfv"]
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.866103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.866184 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.866313 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.866391 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprqs\" (UniqueName: \"kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.969088 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.973016 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.984192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprqs\" (UniqueName: \"kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.984281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.984323 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.984369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.984771 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 14:29:03 crc kubenswrapper[5004]: I1203 14:29:03.995663 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.001578 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.007507 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.014905 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprqs\" (UniqueName: \"kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.018356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data\") pod \"nova-cell0-cell-mapping-79vfv\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.048233 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-79vfv"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.091768 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcjk8\" (UniqueName: \"kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.091948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.092157 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.093197 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.114435 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.115929 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.125153 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.135976 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.142451 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.144282 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.163756 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.170178 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.183923 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.185147 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.194205 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195625 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcjk8\" (UniqueName: \"kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195670 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195707 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195804 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.195872 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s466k\" (UniqueName: \"kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.201735 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.208288 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.232152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.241025 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.242809 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.265533 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcjk8\" (UniqueName: \"kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8\") pod \"nova-api-0\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.276095 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.295961 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.297675 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"]
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299334 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299378 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s466k\" (UniqueName: \"kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299448 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299475 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299499 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827dq\" (UniqueName: \"kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299574 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299597 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299660 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ndtl\" (UniqueName: \"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.299710 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.301250 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.306909 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.316190 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.341532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s466k\" (UniqueName: \"kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k\") pod \"nova-metadata-0\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") " pod="openstack/nova-metadata-0"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403361 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp"
Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403486 5004 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403550 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-827dq\" (UniqueName: \"kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403603 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403668 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403703 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403745 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ndtl\" (UniqueName: \"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403793 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403820 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403946 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.403977 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsm6\" (UniqueName: 
\"kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.409784 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.412600 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.415490 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.415698 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.436429 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-827dq\" (UniqueName: \"kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.440424 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ndtl\" (UniqueName: \"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl\") pod \"nova-scheduler-0\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.506390 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.506793 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsm6\" (UniqueName: \"kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.506840 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.506906 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " 
pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.507019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.507060 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.508972 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.509164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.509267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.509329 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.509561 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.527676 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnsm6\" (UniqueName: \"kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6\") pod \"dnsmasq-dns-757b4f8459-k2lqp\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.619002 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.649715 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.665280 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.680033 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.737834 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-79vfv"] Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.926832 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wb76m"] Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.930299 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.933069 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.934325 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.941411 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.943926 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-79vfv" event={"ID":"b52e9683-d017-4a09-a6fa-5377df5032e1","Type":"ContainerStarted","Data":"8a62863d9e3387034eacbcd9f83d8466f6214013db53f06acfd55c242dd4bf68"} Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.948578 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:29:04 crc kubenswrapper[5004]: I1203 14:29:04.957593 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wb76m"] Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.028795 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts\") 
pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.029108 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.029126 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.029230 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btn8c\" (UniqueName: \"kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.135939 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.136001 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.136024 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.136145 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btn8c\" (UniqueName: \"kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.144426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.198387 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btn8c\" (UniqueName: \"kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.243406 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.244163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data\") pod \"nova-cell1-conductor-db-sync-wb76m\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.271317 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.293711 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:05 crc kubenswrapper[5004]: W1203 14:29:05.351694 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42378b8_2bd2_4c14_8f3d_d55f624536f9.slice/crio-d8abb196d69f03f1070002d7d7a842f7b0fa3a17234c44f8937106f984111f9f WatchSource:0}: Error finding container d8abb196d69f03f1070002d7d7a842f7b0fa3a17234c44f8937106f984111f9f: Status 404 returned error can't find the container with id d8abb196d69f03f1070002d7d7a842f7b0fa3a17234c44f8937106f984111f9f Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.470299 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:05 crc kubenswrapper[5004]: W1203 14:29:05.528413 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ad616f1_74f7_4f52_b86b_3a10abd23a10.slice/crio-877eb5faf48a200419e9ce4d5c94762e4584122e2c67544639d3ff36fe1b1ec0 WatchSource:0}: Error finding container 
877eb5faf48a200419e9ce4d5c94762e4584122e2c67544639d3ff36fe1b1ec0: Status 404 returned error can't find the container with id 877eb5faf48a200419e9ce4d5c94762e4584122e2c67544639d3ff36fe1b1ec0 Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.586589 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:05 crc kubenswrapper[5004]: W1203 14:29:05.611672 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod599482f5_af51_4a34_abc3_97ce21f8b6dd.slice/crio-8cd39621f2bd38ef8a47bb781eeb3c0559aa122db21350f67c1e56f0df7cb672 WatchSource:0}: Error finding container 8cd39621f2bd38ef8a47bb781eeb3c0559aa122db21350f67c1e56f0df7cb672: Status 404 returned error can't find the container with id 8cd39621f2bd38ef8a47bb781eeb3c0559aa122db21350f67c1e56f0df7cb672 Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.647086 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"] Dec 03 14:29:05 crc kubenswrapper[5004]: W1203 14:29:05.701819 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod291b528c_e4a4_4e8b_b88c_7db763b01f37.slice/crio-ee629e0da718e8c1c36d60688b1df34f03aaf83b0f6adb371a122edf0b7b37f1 WatchSource:0}: Error finding container ee629e0da718e8c1c36d60688b1df34f03aaf83b0f6adb371a122edf0b7b37f1: Status 404 returned error can't find the container with id ee629e0da718e8c1c36d60688b1df34f03aaf83b0f6adb371a122edf0b7b37f1 Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.958109 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" event={"ID":"291b528c-e4a4-4e8b-b88c-7db763b01f37","Type":"ContainerStarted","Data":"ee629e0da718e8c1c36d60688b1df34f03aaf83b0f6adb371a122edf0b7b37f1"} Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.960471 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-79vfv" event={"ID":"b52e9683-d017-4a09-a6fa-5377df5032e1","Type":"ContainerStarted","Data":"3075704f48da98497ea8b79f58d0f79d33b25f83b5b44cf2d895d489b003c3ac"} Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.976296 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wb76m"] Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.979276 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerStarted","Data":"19595152d08375d51cea88867cd8ae7f6bf3923b89c62d6b93023d9125443748"} Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.989084 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-79vfv" podStartSLOduration=2.989060672 podStartE2EDuration="2.989060672s" podCreationTimestamp="2025-12-03 14:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:05.985243154 +0000 UTC m=+1358.734213400" watchObservedRunningTime="2025-12-03 14:29:05.989060672 +0000 UTC m=+1358.738030918" Dec 03 14:29:05 crc kubenswrapper[5004]: I1203 14:29:05.990332 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerStarted","Data":"d8abb196d69f03f1070002d7d7a842f7b0fa3a17234c44f8937106f984111f9f"} Dec 03 14:29:06 crc kubenswrapper[5004]: I1203 14:29:06.000531 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ad616f1-74f7-4f52-b86b-3a10abd23a10","Type":"ContainerStarted","Data":"877eb5faf48a200419e9ce4d5c94762e4584122e2c67544639d3ff36fe1b1ec0"} Dec 03 14:29:06 crc kubenswrapper[5004]: I1203 14:29:06.011099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"599482f5-af51-4a34-abc3-97ce21f8b6dd","Type":"ContainerStarted","Data":"8cd39621f2bd38ef8a47bb781eeb3c0559aa122db21350f67c1e56f0df7cb672"} Dec 03 14:29:07 crc kubenswrapper[5004]: I1203 14:29:07.028368 5004 generic.go:334] "Generic (PLEG): container finished" podID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerID="8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf" exitCode=0 Dec 03 14:29:07 crc kubenswrapper[5004]: I1203 14:29:07.029320 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" event={"ID":"291b528c-e4a4-4e8b-b88c-7db763b01f37","Type":"ContainerDied","Data":"8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf"} Dec 03 14:29:07 crc kubenswrapper[5004]: I1203 14:29:07.035592 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wb76m" event={"ID":"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf","Type":"ContainerStarted","Data":"390d9d79e25c674bdb7459b434605e925e68c625da875e49f3881fc6b76d4322"} Dec 03 14:29:07 crc kubenswrapper[5004]: I1203 14:29:07.035643 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wb76m" event={"ID":"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf","Type":"ContainerStarted","Data":"55cc8f00530bbee7ab8ffd334406981de331e7ea540e20e52b6a6983476fa011"} Dec 03 14:29:07 crc kubenswrapper[5004]: I1203 14:29:07.078367 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wb76m" podStartSLOduration=3.07834611 podStartE2EDuration="3.07834611s" podCreationTimestamp="2025-12-03 14:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:07.074318886 +0000 UTC m=+1359.823289122" watchObservedRunningTime="2025-12-03 14:29:07.07834611 +0000 UTC m=+1359.827316346" Dec 03 14:29:09 crc 
kubenswrapper[5004]: I1203 14:29:09.086060 5004 generic.go:334] "Generic (PLEG): container finished" podID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerID="236d4caeb3dfbc2679503a12bfb71e07d8dcf714d41d8e395d8f8ba423c4e12d" exitCode=137
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.086216 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerDied","Data":"236d4caeb3dfbc2679503a12bfb71e07d8dcf714d41d8e395d8f8ba423c4e12d"}
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.278389 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.287597 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.305914 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460531 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460623 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqxvn\" (UniqueName: \"kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460683 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460789 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460807 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460883 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.460910 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd\") pod \"de97d15b-33e6-422f-b094-33bdad1d7f87\" (UID: \"de97d15b-33e6-422f-b094-33bdad1d7f87\") "
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.461376 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.461625 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.468601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn" (OuterVolumeSpecName: "kube-api-access-cqxvn") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "kube-api-access-cqxvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.471947 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts" (OuterVolumeSpecName: "scripts") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.504555 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.563039 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqxvn\" (UniqueName: \"kubernetes.io/projected/de97d15b-33e6-422f-b094-33bdad1d7f87-kube-api-access-cqxvn\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.563348 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.563357 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.563365 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.563374 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de97d15b-33e6-422f-b094-33bdad1d7f87-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.652068 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.659494 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data" (OuterVolumeSpecName: "config-data") pod "de97d15b-33e6-422f-b094-33bdad1d7f87" (UID: "de97d15b-33e6-422f-b094-33bdad1d7f87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.666145 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:09 crc kubenswrapper[5004]: I1203 14:29:09.666174 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de97d15b-33e6-422f-b094-33bdad1d7f87-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.097634 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerStarted","Data":"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.097680 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerStarted","Data":"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.097722 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-metadata" containerID="cri-o://f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252" gracePeriod=30
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.097722 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-log" containerID="cri-o://8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d" gracePeriod=30
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.100820 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ad616f1-74f7-4f52-b86b-3a10abd23a10","Type":"ContainerStarted","Data":"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.102873 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"599482f5-af51-4a34-abc3-97ce21f8b6dd","Type":"ContainerStarted","Data":"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.102941 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="599482f5-af51-4a34-abc3-97ce21f8b6dd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75" gracePeriod=30
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.106022 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" event={"ID":"291b528c-e4a4-4e8b-b88c-7db763b01f37","Type":"ContainerStarted","Data":"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.106747 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.108635 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerStarted","Data":"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.108674 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerStarted","Data":"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.111646 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de97d15b-33e6-422f-b094-33bdad1d7f87","Type":"ContainerDied","Data":"c97a669c94e0533562e9d6d60417ad1f60aa621e868d7ea94d8984021196b4d1"}
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.111693 5004 scope.go:117] "RemoveContainer" containerID="236d4caeb3dfbc2679503a12bfb71e07d8dcf714d41d8e395d8f8ba423c4e12d"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.111832 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.129600 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.563735832 podStartE2EDuration="6.129577765s" podCreationTimestamp="2025-12-03 14:29:04 +0000 UTC" firstStartedPulling="2025-12-03 14:29:05.359192509 +0000 UTC m=+1358.108162745" lastFinishedPulling="2025-12-03 14:29:08.925034442 +0000 UTC m=+1361.674004678" observedRunningTime="2025-12-03 14:29:10.114703953 +0000 UTC m=+1362.863674249" watchObservedRunningTime="2025-12-03 14:29:10.129577765 +0000 UTC m=+1362.878548011"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.142193 5004 scope.go:117] "RemoveContainer" containerID="043a9c8b996743b445b0a9af6af889f3a5c3e2836e18171fffb0b7d7e2d9bf9b"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.152038 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.17470216 podStartE2EDuration="7.15201877s" podCreationTimestamp="2025-12-03 14:29:03 +0000 UTC" firstStartedPulling="2025-12-03 14:29:04.947949649 +0000 UTC m=+1357.696919885" lastFinishedPulling="2025-12-03 14:29:08.925266259 +0000 UTC m=+1361.674236495" observedRunningTime="2025-12-03 14:29:10.149295033 +0000 UTC m=+1362.898265279" watchObservedRunningTime="2025-12-03 14:29:10.15201877 +0000 UTC m=+1362.900989006"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.161585 5004 scope.go:117] "RemoveContainer" containerID="dd12a274d7c3255220428cfe3d02e7243e892a03ab39db597754ac695b1d13e0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.175998 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" podStartSLOduration=6.175979979 podStartE2EDuration="6.175979979s" podCreationTimestamp="2025-12-03 14:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:10.171290536 +0000 UTC m=+1362.920260802" watchObservedRunningTime="2025-12-03 14:29:10.175979979 +0000 UTC m=+1362.924950215"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.183811 5004 scope.go:117] "RemoveContainer" containerID="a9e6a64f9a6abe366139e79765596d46f53c83b150094f475913aa7aeb768a34"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.192002 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.891675802 podStartE2EDuration="6.191982973s" podCreationTimestamp="2025-12-03 14:29:04 +0000 UTC" firstStartedPulling="2025-12-03 14:29:05.628221 +0000 UTC m=+1358.377191236" lastFinishedPulling="2025-12-03 14:29:08.928528171 +0000 UTC m=+1361.677498407" observedRunningTime="2025-12-03 14:29:10.186623811 +0000 UTC m=+1362.935594067" watchObservedRunningTime="2025-12-03 14:29:10.191982973 +0000 UTC m=+1362.940953219"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.223268 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.82984947 podStartE2EDuration="6.223249028s" podCreationTimestamp="2025-12-03 14:29:04 +0000 UTC" firstStartedPulling="2025-12-03 14:29:05.531735047 +0000 UTC m=+1358.280705283" lastFinishedPulling="2025-12-03 14:29:08.925134605 +0000 UTC m=+1361.674104841" observedRunningTime="2025-12-03 14:29:10.208438789 +0000 UTC m=+1362.957409035" watchObservedRunningTime="2025-12-03 14:29:10.223249028 +0000 UTC m=+1362.972219264"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.234729 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.250467 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.269387 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:29:10 crc kubenswrapper[5004]: E1203 14:29:10.269905 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="sg-core"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270053 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="sg-core"
Dec 03 14:29:10 crc kubenswrapper[5004]: E1203 14:29:10.270146 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-notification-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270206 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-notification-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: E1203 14:29:10.270270 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="proxy-httpd"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270327 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="proxy-httpd"
Dec 03 14:29:10 crc kubenswrapper[5004]: E1203 14:29:10.270390 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-central-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270450 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-central-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270738 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-notification-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270818 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="ceilometer-central-agent"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.270906 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="proxy-httpd"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.271007 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" containerName="sg-core"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.272831 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.279476 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.281074 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.281458 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379478 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379646 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xk8\" (UniqueName: \"kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379690 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379724 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379754 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379880 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.379933 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.482981 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xk8\" (UniqueName: \"kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483337 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483390 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483507 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.483568 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.484083 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.490315 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.493517 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.493587 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.506331 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.507043 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.515487 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xk8\" (UniqueName: \"kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8\") pod \"ceilometer-0\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.620737 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.690715 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s466k\" (UniqueName: \"kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k\") pod \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") "
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.690873 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle\") pod \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") "
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.690909 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs\") pod \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") "
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.691101 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data\") pod \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\" (UID: \"e42378b8-2bd2-4c14-8f3d-d55f624536f9\") "
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.692480 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs" (OuterVolumeSpecName: "logs") pod "e42378b8-2bd2-4c14-8f3d-d55f624536f9" (UID: "e42378b8-2bd2-4c14-8f3d-d55f624536f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.695317 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.697525 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42378b8-2bd2-4c14-8f3d-d55f624536f9-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.700828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k" (OuterVolumeSpecName: "kube-api-access-s466k") pod "e42378b8-2bd2-4c14-8f3d-d55f624536f9" (UID: "e42378b8-2bd2-4c14-8f3d-d55f624536f9"). InnerVolumeSpecName "kube-api-access-s466k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.731006 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data" (OuterVolumeSpecName: "config-data") pod "e42378b8-2bd2-4c14-8f3d-d55f624536f9" (UID: "e42378b8-2bd2-4c14-8f3d-d55f624536f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.733206 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42378b8-2bd2-4c14-8f3d-d55f624536f9" (UID: "e42378b8-2bd2-4c14-8f3d-d55f624536f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.799631 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.799874 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s466k\" (UniqueName: \"kubernetes.io/projected/e42378b8-2bd2-4c14-8f3d-d55f624536f9-kube-api-access-s466k\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:10 crc kubenswrapper[5004]: I1203 14:29:10.799887 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42378b8-2bd2-4c14-8f3d-d55f624536f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.009939 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.124314 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerStarted","Data":"b65035c9f81aa806069851a2861bfe256366e30236aa99cb00eb5f728cb153c0"}
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.127365 5004 generic.go:334] "Generic (PLEG): container finished" podID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerID="f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252" exitCode=0
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.127390 5004 generic.go:334] "Generic (PLEG): container finished" podID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerID="8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d" exitCode=143
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.128194 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.130490 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerDied","Data":"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"}
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.130545 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerDied","Data":"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"}
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.130558 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e42378b8-2bd2-4c14-8f3d-d55f624536f9","Type":"ContainerDied","Data":"d8abb196d69f03f1070002d7d7a842f7b0fa3a17234c44f8937106f984111f9f"}
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.130578 5004 scope.go:117] "RemoveContainer" containerID="f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.175948 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.177217 5004 scope.go:117] "RemoveContainer" containerID="8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.184852 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.209539 5004 scope.go:117] "RemoveContainer" containerID="f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.214026 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:11 crc kubenswrapper[5004]: E1203 14:29:11.214505 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-metadata"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.214523 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-metadata"
Dec 03 14:29:11 crc kubenswrapper[5004]: E1203 14:29:11.214558 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-log"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.214564 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-log"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.214801 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-metadata"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.214839 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" containerName="nova-metadata-log"
Dec 03 14:29:11 crc kubenswrapper[5004]: E1203 14:29:11.214979 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252\": container with ID starting with f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252 not found: ID does not exist" containerID="f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.215080 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"} err="failed to get container status \"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252\": rpc error: code = NotFound desc = could not find container \"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252\": container with ID starting with f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252 not found: ID does not exist"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.215163 5004 scope.go:117] "RemoveContainer" containerID="8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.216247 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:29:11 crc kubenswrapper[5004]: E1203 14:29:11.218355 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d\": container with ID starting with 8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d not found: ID does not exist" containerID="8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.218410 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"} err="failed to get container status \"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d\": rpc error: code = NotFound desc = could not find container \"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d\": container with ID starting with 8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d not found: ID does not exist"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.218436 5004 scope.go:117] "RemoveContainer" containerID="f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.225210 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.227848 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252"} err="failed to get container status \"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252\": rpc error: code = NotFound desc = could not find container \"f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252\": container with ID starting with f0f98b0464fb8315c3546359d9ba1100f8b6ca878eaed8939334710506367252 not found: ID does not exist"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.227918 5004 scope.go:117] "RemoveContainer" containerID="8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.228670 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.229639 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.231557 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d"} err="failed to get container status \"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d\": rpc error: code = NotFound desc = could not find container
\"8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d\": container with ID starting with 8a4777ad906ed925ef22a568851e5da208fe22651731e760c28a0bb4ac9e412d not found: ID does not exist" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.313215 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.313282 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.313315 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7rs\" (UniqueName: \"kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.313340 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.313374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.415301 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.415417 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.415514 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7rs\" (UniqueName: \"kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.415559 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.416075 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc 
kubenswrapper[5004]: I1203 14:29:11.421313 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.423968 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.424609 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.430940 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.453594 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7rs\" (UniqueName: \"kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs\") pod \"nova-metadata-0\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.543683 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.624621 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de97d15b-33e6-422f-b094-33bdad1d7f87" path="/var/lib/kubelet/pods/de97d15b-33e6-422f-b094-33bdad1d7f87/volumes" Dec 03 14:29:11 crc kubenswrapper[5004]: I1203 14:29:11.627322 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42378b8-2bd2-4c14-8f3d-d55f624536f9" path="/var/lib/kubelet/pods/e42378b8-2bd2-4c14-8f3d-d55f624536f9/volumes" Dec 03 14:29:12 crc kubenswrapper[5004]: I1203 14:29:12.009192 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:12 crc kubenswrapper[5004]: I1203 14:29:12.139476 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerStarted","Data":"619455ad30f20598269bfdf0c2343f8a8c9f84cef308d8e62368979590baf6b8"} Dec 03 14:29:12 crc kubenswrapper[5004]: I1203 14:29:12.142127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerStarted","Data":"d9c850007896cdecf9b4c1dd9cba5aff97d51dfd677c7671274f8c31cd082f14"} Dec 03 14:29:13 crc kubenswrapper[5004]: I1203 14:29:13.157517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerStarted","Data":"2f2f6431eb0362b1cb23c5cfa682ee7d8833ffa72440e0f16237398c5045092f"} Dec 03 14:29:13 crc kubenswrapper[5004]: I1203 14:29:13.161187 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerStarted","Data":"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea"} Dec 03 14:29:13 crc kubenswrapper[5004]: I1203 14:29:13.161226 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerStarted","Data":"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e"} Dec 03 14:29:13 crc kubenswrapper[5004]: I1203 14:29:13.184940 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.184919586 podStartE2EDuration="2.184919586s" podCreationTimestamp="2025-12-03 14:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:13.181639263 +0000 UTC m=+1365.930609499" watchObservedRunningTime="2025-12-03 14:29:13.184919586 +0000 UTC m=+1365.933889832" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.172666 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerStarted","Data":"71b052ec96176853303050a6281debef555d5753e0e2354438e197b3b0d0ddfc"} Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.298262 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.298321 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.650962 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.651325 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.666394 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.683031 5004 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.692971 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.748202 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:29:14 crc kubenswrapper[5004]: I1203 14:29:14.748466 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="dnsmasq-dns" containerID="cri-o://79694b788d43d70467df161240babad1bc3346b1861cae5356991f5700a97b91" gracePeriod=10 Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.183425 5004 generic.go:334] "Generic (PLEG): container finished" podID="d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" containerID="390d9d79e25c674bdb7459b434605e925e68c625da875e49f3881fc6b76d4322" exitCode=0 Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.183529 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wb76m" event={"ID":"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf","Type":"ContainerDied","Data":"390d9d79e25c674bdb7459b434605e925e68c625da875e49f3881fc6b76d4322"} Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.187318 5004 generic.go:334] "Generic (PLEG): container finished" podID="b52e9683-d017-4a09-a6fa-5377df5032e1" containerID="3075704f48da98497ea8b79f58d0f79d33b25f83b5b44cf2d895d489b003c3ac" exitCode=0 Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.187399 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-79vfv" event={"ID":"b52e9683-d017-4a09-a6fa-5377df5032e1","Type":"ContainerDied","Data":"3075704f48da98497ea8b79f58d0f79d33b25f83b5b44cf2d895d489b003c3ac"} Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.190785 
5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerStarted","Data":"3383b83f8cdf1f3577cb75034c5419f8764fb381da9ff2f10f5ad0a047d3e071"} Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.191214 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.195080 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" event={"ID":"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0","Type":"ContainerDied","Data":"79694b788d43d70467df161240babad1bc3346b1861cae5356991f5700a97b91"} Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.195047 5004 generic.go:334] "Generic (PLEG): container finished" podID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerID="79694b788d43d70467df161240babad1bc3346b1861cae5356991f5700a97b91" exitCode=0 Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.227617 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6919334830000001 podStartE2EDuration="5.227591861s" podCreationTimestamp="2025-12-03 14:29:10 +0000 UTC" firstStartedPulling="2025-12-03 14:29:11.040601472 +0000 UTC m=+1363.789571708" lastFinishedPulling="2025-12-03 14:29:14.57625985 +0000 UTC m=+1367.325230086" observedRunningTime="2025-12-03 14:29:15.218818313 +0000 UTC m=+1367.967788549" watchObservedRunningTime="2025-12-03 14:29:15.227591861 +0000 UTC m=+1367.976562097" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.239940 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.339346 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.381802 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.381832 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.430752 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fz8r\" (UniqueName: \"kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.431226 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.431276 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.431305 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.431334 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.431374 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb\") pod \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\" (UID: \"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0\") " Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.438217 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r" (OuterVolumeSpecName: "kube-api-access-4fz8r") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "kube-api-access-4fz8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.533714 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.534299 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fz8r\" (UniqueName: \"kubernetes.io/projected/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-kube-api-access-4fz8r\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.534337 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.537270 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.552286 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config" (OuterVolumeSpecName: "config") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.556285 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.564962 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" (UID: "9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.636545 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.636576 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.636586 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:15 crc kubenswrapper[5004]: I1203 14:29:15.636595 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.206437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" event={"ID":"9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0","Type":"ContainerDied","Data":"ee17711545a86ae1bdc5ea53dd770b67914850ab9618085b8d8d6913d5018ab5"} Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.206512 5004 scope.go:117] "RemoveContainer" 
containerID="79694b788d43d70467df161240babad1bc3346b1861cae5356991f5700a97b91" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.206538 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-z9mhl" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.237376 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.248021 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-z9mhl"] Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.261065 5004 scope.go:117] "RemoveContainer" containerID="fcdb4e59475e84328c24a47140d1c6de9d342714466115dee35d85a302fd3eab" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.544587 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.545000 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.771254 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.778457 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-79vfv" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.861796 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle\") pod \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.861908 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mprqs\" (UniqueName: \"kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs\") pod \"b52e9683-d017-4a09-a6fa-5377df5032e1\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.861991 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data\") pod \"b52e9683-d017-4a09-a6fa-5377df5032e1\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.862034 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts\") pod \"b52e9683-d017-4a09-a6fa-5377df5032e1\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.862114 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle\") pod \"b52e9683-d017-4a09-a6fa-5377df5032e1\" (UID: \"b52e9683-d017-4a09-a6fa-5377df5032e1\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.862143 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data\") pod \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.862177 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btn8c\" (UniqueName: \"kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c\") pod \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.862209 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts\") pod \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\" (UID: \"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf\") " Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.870739 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs" (OuterVolumeSpecName: "kube-api-access-mprqs") pod "b52e9683-d017-4a09-a6fa-5377df5032e1" (UID: "b52e9683-d017-4a09-a6fa-5377df5032e1"). InnerVolumeSpecName "kube-api-access-mprqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.876156 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts" (OuterVolumeSpecName: "scripts") pod "b52e9683-d017-4a09-a6fa-5377df5032e1" (UID: "b52e9683-d017-4a09-a6fa-5377df5032e1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.878100 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts" (OuterVolumeSpecName: "scripts") pod "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" (UID: "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.878283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c" (OuterVolumeSpecName: "kube-api-access-btn8c") pod "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" (UID: "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf"). InnerVolumeSpecName "kube-api-access-btn8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.903462 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" (UID: "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.906387 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data" (OuterVolumeSpecName: "config-data") pod "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" (UID: "d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.917838 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data" (OuterVolumeSpecName: "config-data") pod "b52e9683-d017-4a09-a6fa-5377df5032e1" (UID: "b52e9683-d017-4a09-a6fa-5377df5032e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.925378 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b52e9683-d017-4a09-a6fa-5377df5032e1" (UID: "b52e9683-d017-4a09-a6fa-5377df5032e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.965542 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966238 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966252 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btn8c\" (UniqueName: \"kubernetes.io/projected/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-kube-api-access-btn8c\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966263 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 
14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966274 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966285 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mprqs\" (UniqueName: \"kubernetes.io/projected/b52e9683-d017-4a09-a6fa-5377df5032e1-kube-api-access-mprqs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966297 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:16 crc kubenswrapper[5004]: I1203 14:29:16.966307 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b52e9683-d017-4a09-a6fa-5377df5032e1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.223704 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wb76m" event={"ID":"d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf","Type":"ContainerDied","Data":"55cc8f00530bbee7ab8ffd334406981de331e7ea540e20e52b6a6983476fa011"} Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.223757 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55cc8f00530bbee7ab8ffd334406981de331e7ea540e20e52b6a6983476fa011" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.223718 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wb76m" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.225230 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-79vfv" event={"ID":"b52e9683-d017-4a09-a6fa-5377df5032e1","Type":"ContainerDied","Data":"8a62863d9e3387034eacbcd9f83d8466f6214013db53f06acfd55c242dd4bf68"} Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.225252 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a62863d9e3387034eacbcd9f83d8466f6214013db53f06acfd55c242dd4bf68" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.225294 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-79vfv" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.309807 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:29:17 crc kubenswrapper[5004]: E1203 14:29:17.310257 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="dnsmasq-dns" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310274 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="dnsmasq-dns" Dec 03 14:29:17 crc kubenswrapper[5004]: E1203 14:29:17.310292 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="init" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310300 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="init" Dec 03 14:29:17 crc kubenswrapper[5004]: E1203 14:29:17.310315 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52e9683-d017-4a09-a6fa-5377df5032e1" containerName="nova-manage" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310323 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b52e9683-d017-4a09-a6fa-5377df5032e1" containerName="nova-manage" Dec 03 14:29:17 crc kubenswrapper[5004]: E1203 14:29:17.310346 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" containerName="nova-cell1-conductor-db-sync" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310351 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" containerName="nova-cell1-conductor-db-sync" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310512 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" containerName="dnsmasq-dns" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310527 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52e9683-d017-4a09-a6fa-5377df5032e1" containerName="nova-manage" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.310541 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" containerName="nova-cell1-conductor-db-sync" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.311165 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.316444 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.335230 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.375188 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbk4p\" (UniqueName: \"kubernetes.io/projected/234c4127-5836-4628-a426-2644c4df71a1-kube-api-access-tbk4p\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.375320 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.391997 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.493568 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc 
kubenswrapper[5004]: I1203 14:29:17.493653 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbk4p\" (UniqueName: \"kubernetes.io/projected/234c4127-5836-4628-a426-2644c4df71a1-kube-api-access-tbk4p\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.493725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.498050 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.498267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/234c4127-5836-4628-a426-2644c4df71a1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.513921 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbk4p\" (UniqueName: \"kubernetes.io/projected/234c4127-5836-4628-a426-2644c4df71a1-kube-api-access-tbk4p\") pod \"nova-cell1-conductor-0\" (UID: \"234c4127-5836-4628-a426-2644c4df71a1\") " pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.534993 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.535476 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-api" containerID="cri-o://0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c" gracePeriod=30 Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.535309 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-log" containerID="cri-o://ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4" gracePeriod=30 Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.545625 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.545824 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerName="nova-scheduler-scheduler" containerID="cri-o://ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" gracePeriod=30 Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.594613 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.594920 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-log" containerID="cri-o://96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" gracePeriod=30 Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.595039 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-metadata" 
containerID="cri-o://ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" gracePeriod=30 Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.622320 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0" path="/var/lib/kubelet/pods/9d27d2d8-33ae-4e15-bd5f-ab671d40bfc0/volumes" Dec 03 14:29:17 crc kubenswrapper[5004]: I1203 14:29:17.637749 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.112262 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 14:29:18 crc kubenswrapper[5004]: W1203 14:29:18.113433 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod234c4127_5836_4628_a426_2644c4df71a1.slice/crio-d50f3111ddd54fa0ca44fb70a8571ce09939837af8a2354dd8c20c0de8354515 WatchSource:0}: Error finding container d50f3111ddd54fa0ca44fb70a8571ce09939837af8a2354dd8c20c0de8354515: Status 404 returned error can't find the container with id d50f3111ddd54fa0ca44fb70a8571ce09939837af8a2354dd8c20c0de8354515 Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.192842 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.255005 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"234c4127-5836-4628-a426-2644c4df71a1","Type":"ContainerStarted","Data":"d50f3111ddd54fa0ca44fb70a8571ce09939837af8a2354dd8c20c0de8354515"} Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.271933 5004 generic.go:334] "Generic (PLEG): container finished" podID="00280e57-6f31-4a39-b245-d5a795277c07" containerID="ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" exitCode=0 Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.271972 5004 generic.go:334] "Generic (PLEG): container finished" podID="00280e57-6f31-4a39-b245-d5a795277c07" containerID="96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" exitCode=143 Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.272022 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerDied","Data":"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea"} Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.272056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerDied","Data":"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e"} Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.272071 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00280e57-6f31-4a39-b245-d5a795277c07","Type":"ContainerDied","Data":"619455ad30f20598269bfdf0c2343f8a8c9f84cef308d8e62368979590baf6b8"} Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.272089 5004 scope.go:117] "RemoveContainer" containerID="ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" Dec 03 14:29:18 crc kubenswrapper[5004]: 
I1203 14:29:18.272134 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.276202 5004 generic.go:334] "Generic (PLEG): container finished" podID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerID="ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4" exitCode=143 Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.276235 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerDied","Data":"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4"} Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.295444 5004 scope.go:117] "RemoveContainer" containerID="96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.305173 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle\") pod \"00280e57-6f31-4a39-b245-d5a795277c07\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.305384 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs\") pod \"00280e57-6f31-4a39-b245-d5a795277c07\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.305465 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws7rs\" (UniqueName: \"kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs\") pod \"00280e57-6f31-4a39-b245-d5a795277c07\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.305504 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs\") pod \"00280e57-6f31-4a39-b245-d5a795277c07\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.305535 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data\") pod \"00280e57-6f31-4a39-b245-d5a795277c07\" (UID: \"00280e57-6f31-4a39-b245-d5a795277c07\") " Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.307072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs" (OuterVolumeSpecName: "logs") pod "00280e57-6f31-4a39-b245-d5a795277c07" (UID: "00280e57-6f31-4a39-b245-d5a795277c07"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.309509 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs" (OuterVolumeSpecName: "kube-api-access-ws7rs") pod "00280e57-6f31-4a39-b245-d5a795277c07" (UID: "00280e57-6f31-4a39-b245-d5a795277c07"). InnerVolumeSpecName "kube-api-access-ws7rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.317156 5004 scope.go:117] "RemoveContainer" containerID="ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" Dec 03 14:29:18 crc kubenswrapper[5004]: E1203 14:29:18.318846 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea\": container with ID starting with ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea not found: ID does not exist" containerID="ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.318913 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea"} err="failed to get container status \"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea\": rpc error: code = NotFound desc = could not find container \"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea\": container with ID starting with ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea not found: ID does not exist" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.318933 5004 scope.go:117] "RemoveContainer" containerID="96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" Dec 03 14:29:18 crc kubenswrapper[5004]: E1203 14:29:18.319474 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e\": container with ID starting with 96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e not found: ID does not exist" containerID="96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.319528 
5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e"} err="failed to get container status \"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e\": rpc error: code = NotFound desc = could not find container \"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e\": container with ID starting with 96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e not found: ID does not exist" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.319548 5004 scope.go:117] "RemoveContainer" containerID="ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.320017 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea"} err="failed to get container status \"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea\": rpc error: code = NotFound desc = could not find container \"ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea\": container with ID starting with ce2f8bdc0e1881281629ad26c1b179635e8aeb0e7d8f707920afbac9716572ea not found: ID does not exist" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.320064 5004 scope.go:117] "RemoveContainer" containerID="96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.320329 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e"} err="failed to get container status \"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e\": rpc error: code = NotFound desc = could not find container \"96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e\": container with ID starting with 
96c4c74f232b9ebbcecd636d998ca9a91be1f3f54eee184bc920595cf21cfd4e not found: ID does not exist" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.338688 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00280e57-6f31-4a39-b245-d5a795277c07" (UID: "00280e57-6f31-4a39-b245-d5a795277c07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.342238 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data" (OuterVolumeSpecName: "config-data") pod "00280e57-6f31-4a39-b245-d5a795277c07" (UID: "00280e57-6f31-4a39-b245-d5a795277c07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.382177 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "00280e57-6f31-4a39-b245-d5a795277c07" (UID: "00280e57-6f31-4a39-b245-d5a795277c07"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.407801 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00280e57-6f31-4a39-b245-d5a795277c07-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.407840 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws7rs\" (UniqueName: \"kubernetes.io/projected/00280e57-6f31-4a39-b245-d5a795277c07-kube-api-access-ws7rs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.407908 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.407918 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.407928 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00280e57-6f31-4a39-b245-d5a795277c07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.602695 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.613851 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.629054 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:18 crc kubenswrapper[5004]: E1203 14:29:18.629451 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-log" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.629470 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-log" Dec 03 14:29:18 crc kubenswrapper[5004]: E1203 14:29:18.629483 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-metadata" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.629489 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-metadata" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.629664 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-log" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.629697 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="00280e57-6f31-4a39-b245-d5a795277c07" containerName="nova-metadata-metadata" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.630627 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.633431 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.633616 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.640100 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.712562 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.713026 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.713161 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnpjx\" (UniqueName: \"kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.713218 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs\") pod \"nova-metadata-0\" (UID: 
\"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.713337 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.814944 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.814988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.815043 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnpjx\" (UniqueName: \"kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.815067 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.815136 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.815795 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.819430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.819653 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.820190 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:18 crc kubenswrapper[5004]: I1203 14:29:18.843787 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnpjx\" (UniqueName: \"kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx\") pod 
\"nova-metadata-0\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") " pod="openstack/nova-metadata-0" Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.058915 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.289054 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"234c4127-5836-4628-a426-2644c4df71a1","Type":"ContainerStarted","Data":"00d8e1d09a0440d977e83e3a014a5feb041657b529380ab77fe2b4762c692821"} Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.290201 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.316155 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.316135311 podStartE2EDuration="2.316135311s" podCreationTimestamp="2025-12-03 14:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:19.308141314 +0000 UTC m=+1372.057111550" watchObservedRunningTime="2025-12-03 14:29:19.316135311 +0000 UTC m=+1372.065105547" Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.539782 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:29:19 crc kubenswrapper[5004]: I1203 14:29:19.629432 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00280e57-6f31-4a39-b245-d5a795277c07" path="/var/lib/kubelet/pods/00280e57-6f31-4a39-b245-d5a795277c07/volumes" Dec 03 14:29:19 crc kubenswrapper[5004]: E1203 14:29:19.654109 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 14:29:19 crc kubenswrapper[5004]: E1203 14:29:19.656250 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 14:29:19 crc kubenswrapper[5004]: E1203 14:29:19.660468 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 14:29:19 crc kubenswrapper[5004]: E1203 14:29:19.660536 5004 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerName="nova-scheduler-scheduler" Dec 03 14:29:20 crc kubenswrapper[5004]: I1203 14:29:20.301337 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerStarted","Data":"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"} Dec 03 14:29:20 crc kubenswrapper[5004]: I1203 14:29:20.301691 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerStarted","Data":"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"} Dec 03 14:29:20 crc kubenswrapper[5004]: I1203 14:29:20.301707 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerStarted","Data":"f2b7a3b3efa89b66c510f5fd0b89653d5f6e7fdd44222dcb4c81ad5fa3d19b04"} Dec 03 14:29:20 crc kubenswrapper[5004]: I1203 14:29:20.329125 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.329098966 podStartE2EDuration="2.329098966s" podCreationTimestamp="2025-12-03 14:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:20.318353332 +0000 UTC m=+1373.067323568" watchObservedRunningTime="2025-12-03 14:29:20.329098966 +0000 UTC m=+1373.078069202" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.158720 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.279517 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle\") pod \"fc921088-fea3-4c7a-95e1-a6b493d769a8\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.279608 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcjk8\" (UniqueName: \"kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8\") pod \"fc921088-fea3-4c7a-95e1-a6b493d769a8\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.279646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data\") pod \"fc921088-fea3-4c7a-95e1-a6b493d769a8\" (UID: 
\"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.279709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs\") pod \"fc921088-fea3-4c7a-95e1-a6b493d769a8\" (UID: \"fc921088-fea3-4c7a-95e1-a6b493d769a8\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.280606 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs" (OuterVolumeSpecName: "logs") pod "fc921088-fea3-4c7a-95e1-a6b493d769a8" (UID: "fc921088-fea3-4c7a-95e1-a6b493d769a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.291330 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8" (OuterVolumeSpecName: "kube-api-access-dcjk8") pod "fc921088-fea3-4c7a-95e1-a6b493d769a8" (UID: "fc921088-fea3-4c7a-95e1-a6b493d769a8"). InnerVolumeSpecName "kube-api-access-dcjk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.313297 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerDied","Data":"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c"} Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.313335 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.313375 5004 scope.go:117] "RemoveContainer" containerID="0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.313609 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc921088-fea3-4c7a-95e1-a6b493d769a8" (UID: "fc921088-fea3-4c7a-95e1-a6b493d769a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.314467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data" (OuterVolumeSpecName: "config-data") pod "fc921088-fea3-4c7a-95e1-a6b493d769a8" (UID: "fc921088-fea3-4c7a-95e1-a6b493d769a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.313178 5004 generic.go:334] "Generic (PLEG): container finished" podID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerID="0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c" exitCode=0 Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.320049 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc921088-fea3-4c7a-95e1-a6b493d769a8","Type":"ContainerDied","Data":"19595152d08375d51cea88867cd8ae7f6bf3923b89c62d6b93023d9125443748"} Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.383478 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.384086 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc921088-fea3-4c7a-95e1-a6b493d769a8-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.384105 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc921088-fea3-4c7a-95e1-a6b493d769a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.384121 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcjk8\" (UniqueName: \"kubernetes.io/projected/fc921088-fea3-4c7a-95e1-a6b493d769a8-kube-api-access-dcjk8\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.481462 5004 scope.go:117] "RemoveContainer" containerID="ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.503728 5004 scope.go:117] "RemoveContainer" 
containerID="0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c" Dec 03 14:29:21 crc kubenswrapper[5004]: E1203 14:29:21.504317 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c\": container with ID starting with 0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c not found: ID does not exist" containerID="0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.504371 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c"} err="failed to get container status \"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c\": rpc error: code = NotFound desc = could not find container \"0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c\": container with ID starting with 0e99dbb06591de9c8ba39029ef30b75d91338b515c54c309ff2c64648851b53c not found: ID does not exist" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.504403 5004 scope.go:117] "RemoveContainer" containerID="ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4" Dec 03 14:29:21 crc kubenswrapper[5004]: E1203 14:29:21.504704 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4\": container with ID starting with ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4 not found: ID does not exist" containerID="ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.504729 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4"} err="failed to get container status \"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4\": rpc error: code = NotFound desc = could not find container \"ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4\": container with ID starting with ce0244ba295bc71e990722244beeed4e83fb730aa5d4bb9f2dd516a7b05cc2f4 not found: ID does not exist" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.623135 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.686013 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.689582 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ndtl\" (UniqueName: \"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl\") pod \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.689672 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle\") pod \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.689710 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data\") pod \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\" (UID: \"9ad616f1-74f7-4f52-b86b-3a10abd23a10\") " Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.693927 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl" (OuterVolumeSpecName: "kube-api-access-4ndtl") pod "9ad616f1-74f7-4f52-b86b-3a10abd23a10" (UID: "9ad616f1-74f7-4f52-b86b-3a10abd23a10"). InnerVolumeSpecName "kube-api-access-4ndtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.697956 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.711500 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:21 crc kubenswrapper[5004]: E1203 14:29:21.712302 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerName="nova-scheduler-scheduler" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712329 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerName="nova-scheduler-scheduler" Dec 03 14:29:21 crc kubenswrapper[5004]: E1203 14:29:21.712370 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-api" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712378 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-api" Dec 03 14:29:21 crc kubenswrapper[5004]: E1203 14:29:21.712390 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-log" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712397 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-log" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712707 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" 
containerName="nova-api-api" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712754 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerName="nova-scheduler-scheduler" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.712780 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" containerName="nova-api-log" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.714737 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.716482 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data" (OuterVolumeSpecName: "config-data") pod "9ad616f1-74f7-4f52-b86b-3a10abd23a10" (UID: "9ad616f1-74f7-4f52-b86b-3a10abd23a10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.717756 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.720039 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ad616f1-74f7-4f52-b86b-3a10abd23a10" (UID: "9ad616f1-74f7-4f52-b86b-3a10abd23a10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.721577 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.792453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.792732 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.792891 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.792953 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsz4z\" (UniqueName: \"kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.793140 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ndtl\" (UniqueName: \"kubernetes.io/projected/9ad616f1-74f7-4f52-b86b-3a10abd23a10-kube-api-access-4ndtl\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: 
I1203 14:29:21.793158 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.793167 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ad616f1-74f7-4f52-b86b-3a10abd23a10-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.896100 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.896286 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.896455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.897200 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.897912 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xsz4z\" (UniqueName: \"kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.900104 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.900366 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:21 crc kubenswrapper[5004]: I1203 14:29:21.914212 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsz4z\" (UniqueName: \"kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z\") pod \"nova-api-0\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " pod="openstack/nova-api-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.040468 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.337543 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" exitCode=0 Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.337602 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.337600 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ad616f1-74f7-4f52-b86b-3a10abd23a10","Type":"ContainerDied","Data":"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5"} Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.338254 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9ad616f1-74f7-4f52-b86b-3a10abd23a10","Type":"ContainerDied","Data":"877eb5faf48a200419e9ce4d5c94762e4584122e2c67544639d3ff36fe1b1ec0"} Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.338290 5004 scope.go:117] "RemoveContainer" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.377682 5004 scope.go:117] "RemoveContainer" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" Dec 03 14:29:22 crc kubenswrapper[5004]: E1203 14:29:22.380347 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5\": container with ID starting with ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5 not found: ID does not exist" containerID="ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.380470 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5"} err="failed to get container status \"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5\": rpc error: code = NotFound desc = could not find container \"ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5\": container with ID starting with 
ab0fe40c0cec1c1171f2824253f392011ef10cfa0092d47153862d6ddc0272f5 not found: ID does not exist" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.382009 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.394685 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.413306 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.415120 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.417102 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.428640 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.514062 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.514279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56g4q\" (UniqueName: \"kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.514430 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc 
kubenswrapper[5004]: I1203 14:29:22.514517 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.616098 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56g4q\" (UniqueName: \"kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.616281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.616367 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.622797 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.623551 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.632263 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56g4q\" (UniqueName: \"kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q\") pod \"nova-scheduler-0\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") " pod="openstack/nova-scheduler-0" Dec 03 14:29:22 crc kubenswrapper[5004]: I1203 14:29:22.774170 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.191591 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.352195 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad294a12-4be8-4326-8f8e-8aec9157343d","Type":"ContainerStarted","Data":"cb40f856ea3b05760057176d3c62b095b808925640f49de1fb957aa4eaff68e3"} Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.353763 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerStarted","Data":"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce"} Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.353780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerStarted","Data":"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf"} Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.353791 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerStarted","Data":"81c4989c1ab7c7784d57787cbb556b475a779b65869d345253135b9611e12f9a"} Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.376702 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.376685198 podStartE2EDuration="2.376685198s" podCreationTimestamp="2025-12-03 14:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:23.372250552 +0000 UTC m=+1376.121220808" watchObservedRunningTime="2025-12-03 14:29:23.376685198 +0000 UTC m=+1376.125655434" Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.625789 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad616f1-74f7-4f52-b86b-3a10abd23a10" path="/var/lib/kubelet/pods/9ad616f1-74f7-4f52-b86b-3a10abd23a10/volumes" Dec 03 14:29:23 crc kubenswrapper[5004]: I1203 14:29:23.626654 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc921088-fea3-4c7a-95e1-a6b493d769a8" path="/var/lib/kubelet/pods/fc921088-fea3-4c7a-95e1-a6b493d769a8/volumes" Dec 03 14:29:24 crc kubenswrapper[5004]: I1203 14:29:24.059572 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:29:24 crc kubenswrapper[5004]: I1203 14:29:24.059656 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:29:24 crc kubenswrapper[5004]: I1203 14:29:24.368449 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad294a12-4be8-4326-8f8e-8aec9157343d","Type":"ContainerStarted","Data":"579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68"} Dec 03 14:29:24 crc kubenswrapper[5004]: I1203 14:29:24.398665 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.398640968 podStartE2EDuration="2.398640968s" podCreationTimestamp="2025-12-03 14:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:24.386929186 +0000 UTC m=+1377.135899442" watchObservedRunningTime="2025-12-03 14:29:24.398640968 +0000 UTC m=+1377.147611204" Dec 03 14:29:27 crc kubenswrapper[5004]: I1203 14:29:27.678746 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 14:29:27 crc kubenswrapper[5004]: I1203 14:29:27.774509 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:29:29 crc kubenswrapper[5004]: I1203 14:29:29.060197 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:29:29 crc kubenswrapper[5004]: I1203 14:29:29.060649 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:29:30 crc kubenswrapper[5004]: I1203 14:29:30.075133 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:30 crc kubenswrapper[5004]: I1203 14:29:30.075156 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:32 crc kubenswrapper[5004]: I1203 14:29:32.041007 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Dec 03 14:29:32 crc kubenswrapper[5004]: I1203 14:29:32.041441 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:29:32 crc kubenswrapper[5004]: I1203 14:29:32.774645 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:29:32 crc kubenswrapper[5004]: I1203 14:29:32.803972 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:29:33 crc kubenswrapper[5004]: I1203 14:29:33.123079 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:33 crc kubenswrapper[5004]: I1203 14:29:33.123081 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 14:29:33 crc kubenswrapper[5004]: I1203 14:29:33.482293 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:29:39 crc kubenswrapper[5004]: I1203 14:29:39.065591 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:29:39 crc kubenswrapper[5004]: I1203 14:29:39.067046 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:29:39 crc kubenswrapper[5004]: I1203 14:29:39.071621 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:29:39 crc kubenswrapper[5004]: I1203 14:29:39.071817 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.512292 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.512751 5004 generic.go:334] "Generic (PLEG): container finished" podID="599482f5-af51-4a34-abc3-97ce21f8b6dd" containerID="5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75" exitCode=137 Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.512811 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"599482f5-af51-4a34-abc3-97ce21f8b6dd","Type":"ContainerDied","Data":"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75"} Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.512846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"599482f5-af51-4a34-abc3-97ce21f8b6dd","Type":"ContainerDied","Data":"8cd39621f2bd38ef8a47bb781eeb3c0559aa122db21350f67c1e56f0df7cb672"} Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.512883 5004 scope.go:117] "RemoveContainer" containerID="5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.541199 5004 scope.go:117] "RemoveContainer" containerID="5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75" Dec 03 14:29:40 crc kubenswrapper[5004]: E1203 14:29:40.541625 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75\": container with ID starting with 5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75 not found: ID does not exist" containerID="5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75" Dec 03 
14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.541679 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75"} err="failed to get container status \"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75\": rpc error: code = NotFound desc = could not find container \"5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75\": container with ID starting with 5d51fecb665352c36b3d46626a98fb13c5003451c46a65785be5ffcaedb5ad75 not found: ID does not exist" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.573742 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-827dq\" (UniqueName: \"kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq\") pod \"599482f5-af51-4a34-abc3-97ce21f8b6dd\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.573922 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle\") pod \"599482f5-af51-4a34-abc3-97ce21f8b6dd\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.573996 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data\") pod \"599482f5-af51-4a34-abc3-97ce21f8b6dd\" (UID: \"599482f5-af51-4a34-abc3-97ce21f8b6dd\") " Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.587251 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq" (OuterVolumeSpecName: "kube-api-access-827dq") pod "599482f5-af51-4a34-abc3-97ce21f8b6dd" (UID: 
"599482f5-af51-4a34-abc3-97ce21f8b6dd"). InnerVolumeSpecName "kube-api-access-827dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.607072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data" (OuterVolumeSpecName: "config-data") pod "599482f5-af51-4a34-abc3-97ce21f8b6dd" (UID: "599482f5-af51-4a34-abc3-97ce21f8b6dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.610526 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "599482f5-af51-4a34-abc3-97ce21f8b6dd" (UID: "599482f5-af51-4a34-abc3-97ce21f8b6dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.677309 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-827dq\" (UniqueName: \"kubernetes.io/projected/599482f5-af51-4a34-abc3-97ce21f8b6dd-kube-api-access-827dq\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.677345 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.677356 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599482f5-af51-4a34-abc3-97ce21f8b6dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:40 crc kubenswrapper[5004]: I1203 14:29:40.702402 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" 
Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.523112 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.558125 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.572189 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.581104 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:41 crc kubenswrapper[5004]: E1203 14:29:41.581662 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599482f5-af51-4a34-abc3-97ce21f8b6dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.581688 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="599482f5-af51-4a34-abc3-97ce21f8b6dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.581946 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="599482f5-af51-4a34-abc3-97ce21f8b6dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.582805 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.584897 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.585380 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.585381 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.590734 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.626589 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599482f5-af51-4a34-abc3-97ce21f8b6dd" path="/var/lib/kubelet/pods/599482f5-af51-4a34-abc3-97ce21f8b6dd/volumes" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.696825 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.696911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbg5g\" (UniqueName: \"kubernetes.io/projected/182237d5-f265-4577-8b9a-51f4e2a64a6a-kube-api-access-pbg5g\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.696952 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.696987 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.697204 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.799446 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.799507 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbg5g\" (UniqueName: \"kubernetes.io/projected/182237d5-f265-4577-8b9a-51f4e2a64a6a-kube-api-access-pbg5g\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.799544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.799583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.799639 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.805416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.805684 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.806254 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-nova-novncproxy-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.806700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/182237d5-f265-4577-8b9a-51f4e2a64a6a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.827610 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbg5g\" (UniqueName: \"kubernetes.io/projected/182237d5-f265-4577-8b9a-51f4e2a64a6a-kube-api-access-pbg5g\") pod \"nova-cell1-novncproxy-0\" (UID: \"182237d5-f265-4577-8b9a-51f4e2a64a6a\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:41 crc kubenswrapper[5004]: I1203 14:29:41.933909 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.045699 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.046262 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.046363 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.049900 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.381591 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 14:29:42 crc kubenswrapper[5004]: W1203 14:29:42.381681 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod182237d5_f265_4577_8b9a_51f4e2a64a6a.slice/crio-3c2dd531ee683ce898cc67d6905e8894689bcd27c14228fefb6353f16b79ecea WatchSource:0}: Error finding container 3c2dd531ee683ce898cc67d6905e8894689bcd27c14228fefb6353f16b79ecea: Status 404 returned error can't find the container with id 3c2dd531ee683ce898cc67d6905e8894689bcd27c14228fefb6353f16b79ecea Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.533601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"182237d5-f265-4577-8b9a-51f4e2a64a6a","Type":"ContainerStarted","Data":"3c2dd531ee683ce898cc67d6905e8894689bcd27c14228fefb6353f16b79ecea"} Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.533987 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.538106 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.726233 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.727892 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.750433 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821629 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821682 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821703 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821737 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-drlbw\" (UniqueName: \"kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.821753 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923565 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923647 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923687 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923731 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlbw\" (UniqueName: \"kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.923747 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.924739 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.924812 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.924816 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config\") pod 
\"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.925052 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.925568 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:42 crc kubenswrapper[5004]: I1203 14:29:42.946896 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlbw\" (UniqueName: \"kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw\") pod \"dnsmasq-dns-89c5cd4d5-qkvj5\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:43 crc kubenswrapper[5004]: I1203 14:29:43.052704 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:43 crc kubenswrapper[5004]: I1203 14:29:43.529925 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:29:43 crc kubenswrapper[5004]: I1203 14:29:43.574909 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"182237d5-f265-4577-8b9a-51f4e2a64a6a","Type":"ContainerStarted","Data":"1dcf37aca429d7dfffb6bd91f75711dd5a62daf46451a528423e6d789ad706e0"} Dec 03 14:29:43 crc kubenswrapper[5004]: I1203 14:29:43.608098 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.60807381 podStartE2EDuration="2.60807381s" podCreationTimestamp="2025-12-03 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:43.59608619 +0000 UTC m=+1396.345056426" watchObservedRunningTime="2025-12-03 14:29:43.60807381 +0000 UTC m=+1396.357044046" Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.584130 5004 generic.go:334] "Generic (PLEG): container finished" podID="d34e472d-b443-4e4f-9843-694db62e3394" containerID="e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53" exitCode=0 Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.584194 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" event={"ID":"d34e472d-b443-4e4f-9843-694db62e3394","Type":"ContainerDied","Data":"e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53"} Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.584408 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" event={"ID":"d34e472d-b443-4e4f-9843-694db62e3394","Type":"ContainerStarted","Data":"23f1cbd8e2b40b809655fcc25655a917d21454b937fb4215b1e43ebc974cf9b6"} Dec 03 14:29:44 
crc kubenswrapper[5004]: I1203 14:29:44.788456 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.788716 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-central-agent" containerID="cri-o://d9c850007896cdecf9b4c1dd9cba5aff97d51dfd677c7671274f8c31cd082f14" gracePeriod=30 Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.788787 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="sg-core" containerID="cri-o://71b052ec96176853303050a6281debef555d5753e0e2354438e197b3b0d0ddfc" gracePeriod=30 Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.788798 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-notification-agent" containerID="cri-o://2f2f6431eb0362b1cb23c5cfa682ee7d8833ffa72440e0f16237398c5045092f" gracePeriod=30 Dec 03 14:29:44 crc kubenswrapper[5004]: I1203 14:29:44.788787 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="proxy-httpd" containerID="cri-o://3383b83f8cdf1f3577cb75034c5419f8764fb381da9ff2f10f5ad0a047d3e071" gracePeriod=30 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.291939 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.623377 5004 generic.go:334] "Generic (PLEG): container finished" podID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerID="3383b83f8cdf1f3577cb75034c5419f8764fb381da9ff2f10f5ad0a047d3e071" exitCode=0 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.623415 
5004 generic.go:334] "Generic (PLEG): container finished" podID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerID="71b052ec96176853303050a6281debef555d5753e0e2354438e197b3b0d0ddfc" exitCode=2 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.623423 5004 generic.go:334] "Generic (PLEG): container finished" podID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerID="d9c850007896cdecf9b4c1dd9cba5aff97d51dfd677c7671274f8c31cd082f14" exitCode=0 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.628265 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-log" containerID="cri-o://95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf" gracePeriod=30 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.628667 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-api" containerID="cri-o://179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce" gracePeriod=30 Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.631126 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerDied","Data":"3383b83f8cdf1f3577cb75034c5419f8764fb381da9ff2f10f5ad0a047d3e071"} Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.631187 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerDied","Data":"71b052ec96176853303050a6281debef555d5753e0e2354438e197b3b0d0ddfc"} Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.631205 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerDied","Data":"d9c850007896cdecf9b4c1dd9cba5aff97d51dfd677c7671274f8c31cd082f14"} Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.631218 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" event={"ID":"d34e472d-b443-4e4f-9843-694db62e3394","Type":"ContainerStarted","Data":"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c"} Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.631241 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:45 crc kubenswrapper[5004]: I1203 14:29:45.664077 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" podStartSLOduration=3.664055861 podStartE2EDuration="3.664055861s" podCreationTimestamp="2025-12-03 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:45.653555944 +0000 UTC m=+1398.402526180" watchObservedRunningTime="2025-12-03 14:29:45.664055861 +0000 UTC m=+1398.413026097" Dec 03 14:29:46 crc kubenswrapper[5004]: I1203 14:29:46.640957 5004 generic.go:334] "Generic (PLEG): container finished" podID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerID="95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf" exitCode=143 Dec 03 14:29:46 crc kubenswrapper[5004]: I1203 14:29:46.641006 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerDied","Data":"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf"} Dec 03 14:29:46 crc kubenswrapper[5004]: I1203 14:29:46.934695 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.269060 5004 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.339910 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data\") pod \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.339978 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs\") pod \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.340189 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle\") pod \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.340301 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsz4z\" (UniqueName: \"kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z\") pod \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\" (UID: \"c1cb216c-80fa-441b-a9eb-e05d672ba3b6\") " Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.342367 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs" (OuterVolumeSpecName: "logs") pod "c1cb216c-80fa-441b-a9eb-e05d672ba3b6" (UID: "c1cb216c-80fa-441b-a9eb-e05d672ba3b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.350127 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z" (OuterVolumeSpecName: "kube-api-access-xsz4z") pod "c1cb216c-80fa-441b-a9eb-e05d672ba3b6" (UID: "c1cb216c-80fa-441b-a9eb-e05d672ba3b6"). InnerVolumeSpecName "kube-api-access-xsz4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.381132 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data" (OuterVolumeSpecName: "config-data") pod "c1cb216c-80fa-441b-a9eb-e05d672ba3b6" (UID: "c1cb216c-80fa-441b-a9eb-e05d672ba3b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.404145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1cb216c-80fa-441b-a9eb-e05d672ba3b6" (UID: "c1cb216c-80fa-441b-a9eb-e05d672ba3b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.442318 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.442354 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsz4z\" (UniqueName: \"kubernetes.io/projected/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-kube-api-access-xsz4z\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.442369 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.442378 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cb216c-80fa-441b-a9eb-e05d672ba3b6-logs\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.682506 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.682560 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerDied","Data":"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce"} Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.682623 5004 scope.go:117] "RemoveContainer" containerID="179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.682413 5004 generic.go:334] "Generic (PLEG): container finished" podID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerID="179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce" exitCode=0 Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.683664 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c1cb216c-80fa-441b-a9eb-e05d672ba3b6","Type":"ContainerDied","Data":"81c4989c1ab7c7784d57787cbb556b475a779b65869d345253135b9611e12f9a"} Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.686920 5004 generic.go:334] "Generic (PLEG): container finished" podID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerID="2f2f6431eb0362b1cb23c5cfa682ee7d8833ffa72440e0f16237398c5045092f" exitCode=0 Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.686973 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerDied","Data":"2f2f6431eb0362b1cb23c5cfa682ee7d8833ffa72440e0f16237398c5045092f"} Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.706730 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.719318 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.734055 5004 
scope.go:117] "RemoveContainer" containerID="95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.754304 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:49 crc kubenswrapper[5004]: E1203 14:29:49.756188 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-api" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.756293 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-api" Dec 03 14:29:49 crc kubenswrapper[5004]: E1203 14:29:49.756424 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-log" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.756493 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-log" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.757398 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-log" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.757515 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" containerName="nova-api-api" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.761845 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.764579 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.772912 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.773149 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.778703 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.780837 5004 scope.go:117] "RemoveContainer" containerID="179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce" Dec 03 14:29:49 crc kubenswrapper[5004]: E1203 14:29:49.781770 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce\": container with ID starting with 179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce not found: ID does not exist" containerID="179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.781818 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce"} err="failed to get container status \"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce\": rpc error: code = NotFound desc = could not find container \"179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce\": container with ID starting with 179da047a252fadebe424316cec50864b7385c9a046aeff0fcc63250a1722dce not found: ID does not exist" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.781851 5004 
scope.go:117] "RemoveContainer" containerID="95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf" Dec 03 14:29:49 crc kubenswrapper[5004]: E1203 14:29:49.782964 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf\": container with ID starting with 95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf not found: ID does not exist" containerID="95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.783005 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf"} err="failed to get container status \"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf\": rpc error: code = NotFound desc = could not find container \"95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf\": container with ID starting with 95019b43c4b491e5b658328d18e16c0a2ef3ccc24c552b4e9a32982c0d83e8bf not found: ID does not exist" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.852606 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.852682 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.852721 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.852813 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm7v\" (UniqueName: \"kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.852947 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.853055 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955142 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955217 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955249 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm7v\" (UniqueName: \"kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955311 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.955351 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.956987 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.959622 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.961483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.969457 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.979053 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:49 crc kubenswrapper[5004]: I1203 14:29:49.989147 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm7v\" (UniqueName: \"kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v\") pod \"nova-api-0\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") " pod="openstack/nova-api-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.072619 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.104320 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.156179 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.156386 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" containerName="kube-state-metrics" containerID="cri-o://c69d72633e6999d234db76423d2f8667cbed3c362e71754589644013ec0bb721" gracePeriod=30 Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159033 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159106 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159142 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159187 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: 
I1203 14:29:50.159243 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159258 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159301 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xk8\" (UniqueName: \"kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.159928 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.160814 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.164156 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts" (OuterVolumeSpecName: "scripts") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.175542 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8" (OuterVolumeSpecName: "kube-api-access-49xk8") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "kube-api-access-49xk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.269520 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.270684 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") pod \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\" (UID: \"a4d3db0f-5e76-4823-a586-cb6caa9e3b54\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.271351 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.271364 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.271373 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.271381 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49xk8\" (UniqueName: \"kubernetes.io/projected/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-kube-api-access-49xk8\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: W1203 14:29:50.271460 5004 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a4d3db0f-5e76-4823-a586-cb6caa9e3b54/volumes/kubernetes.io~secret/sg-core-conf-yaml Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.271471 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod 
"a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.379754 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.381898 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.382017 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.468401 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data" (OuterVolumeSpecName: "config-data") pod "a4d3db0f-5e76-4823-a586-cb6caa9e3b54" (UID: "a4d3db0f-5e76-4823-a586-cb6caa9e3b54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.483762 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d3db0f-5e76-4823-a586-cb6caa9e3b54-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.703569 5004 generic.go:334] "Generic (PLEG): container finished" podID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" containerID="c69d72633e6999d234db76423d2f8667cbed3c362e71754589644013ec0bb721" exitCode=2 Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.703648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd","Type":"ContainerDied","Data":"c69d72633e6999d234db76423d2f8667cbed3c362e71754589644013ec0bb721"} Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.707943 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4d3db0f-5e76-4823-a586-cb6caa9e3b54","Type":"ContainerDied","Data":"b65035c9f81aa806069851a2861bfe256366e30236aa99cb00eb5f728cb153c0"} Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.707999 5004 scope.go:117] "RemoveContainer" containerID="3383b83f8cdf1f3577cb75034c5419f8764fb381da9ff2f10f5ad0a047d3e071" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.708137 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.748922 5004 scope.go:117] "RemoveContainer" containerID="71b052ec96176853303050a6281debef555d5753e0e2354438e197b3b0d0ddfc" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.750166 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.762460 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.772655 5004 scope.go:117] "RemoveContainer" containerID="2f2f6431eb0362b1cb23c5cfa682ee7d8833ffa72440e0f16237398c5045092f" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.779924 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.787499 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: E1203 14:29:50.787980 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="proxy-httpd" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788006 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="proxy-httpd" Dec 03 14:29:50 crc kubenswrapper[5004]: E1203 14:29:50.788039 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-central-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788048 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-central-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: E1203 14:29:50.788067 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="sg-core" 
Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788075 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="sg-core" Dec 03 14:29:50 crc kubenswrapper[5004]: E1203 14:29:50.788089 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-notification-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788098 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-notification-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788304 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-notification-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788325 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="sg-core" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788348 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="proxy-httpd" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.788369 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" containerName="ceilometer-central-agent" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.815167 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.815294 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.817821 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.825619 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.828979 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.864799 5004 scope.go:117] "RemoveContainer" containerID="d9c850007896cdecf9b4c1dd9cba5aff97d51dfd677c7671274f8c31cd082f14" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.890842 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcjt\" (UniqueName: \"kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt\") pod \"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd\" (UID: \"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd\") " Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891106 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n8v5\" (UniqueName: \"kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891189 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891210 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891234 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891275 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891300 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.891364 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.897745 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt" 
(OuterVolumeSpecName: "kube-api-access-trcjt") pod "8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" (UID: "8a8a68f6-e00d-4d25-87ac-aa973e7e44cd"). InnerVolumeSpecName "kube-api-access-trcjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.994715 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995221 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995253 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995302 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 
03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995392 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995438 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n8v5\" (UniqueName: \"kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.995502 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcjt\" (UniqueName: \"kubernetes.io/projected/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd-kube-api-access-trcjt\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.996382 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:50 crc kubenswrapper[5004]: I1203 14:29:50.997127 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.000712 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " 
pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.001396 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.004610 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.009274 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.013877 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n8v5\" (UniqueName: \"kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5\") pod \"ceilometer-0\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.299838 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.622167 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d3db0f-5e76-4823-a586-cb6caa9e3b54" path="/var/lib/kubelet/pods/a4d3db0f-5e76-4823-a586-cb6caa9e3b54/volumes" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.623299 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1cb216c-80fa-441b-a9eb-e05d672ba3b6" path="/var/lib/kubelet/pods/c1cb216c-80fa-441b-a9eb-e05d672ba3b6/volumes" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.733030 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerStarted","Data":"a15c3295c1dcc4fb4c273859c3721801a4f74ede1d975e7a8169766989250c8c"} Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.733333 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerStarted","Data":"56fe51ab5327e69a4e4db3a7752dd199995e19892d15cf1344a02060b67a1bb2"} Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.733350 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerStarted","Data":"b4108f810acaad5f959fb7b4c99d3bad040e710231bf449ffe88531fc299c4ba"} Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.735487 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.735479 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a8a68f6-e00d-4d25-87ac-aa973e7e44cd","Type":"ContainerDied","Data":"e243d8535961cf55fafe3e2da7aa1ed788b2b7642d3d4649aa9f7807e0ed9ef7"} Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.736005 5004 scope.go:117] "RemoveContainer" containerID="c69d72633e6999d234db76423d2f8667cbed3c362e71754589644013ec0bb721" Dec 03 14:29:51 crc kubenswrapper[5004]: W1203 14:29:51.769153 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44f095fe_f931_4dd8_8c1b_9f3e67c9a33e.slice/crio-ea7d0af2a5bd48c385d830e9d20bf187efcd76d813150c157546adde31d72d00 WatchSource:0}: Error finding container ea7d0af2a5bd48c385d830e9d20bf187efcd76d813150c157546adde31d72d00: Status 404 returned error can't find the container with id ea7d0af2a5bd48c385d830e9d20bf187efcd76d813150c157546adde31d72d00 Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.773511 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.780075 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.780056515 podStartE2EDuration="2.780056515s" podCreationTimestamp="2025-12-03 14:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:51.761678064 +0000 UTC m=+1404.510648310" watchObservedRunningTime="2025-12-03 14:29:51.780056515 +0000 UTC m=+1404.529026751" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.808922 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 
14:29:51.825030 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.836196 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:51 crc kubenswrapper[5004]: E1203 14:29:51.836575 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" containerName="kube-state-metrics" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.836586 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" containerName="kube-state-metrics" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.836742 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" containerName="kube-state-metrics" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.837397 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.840148 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.840441 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.848417 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.915115 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fb5\" (UniqueName: \"kubernetes.io/projected/70b42ea8-681a-44cb-a494-2093b925d015-kube-api-access-45fb5\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: 
I1203 14:29:51.915159 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.915195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.915307 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.935061 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:51 crc kubenswrapper[5004]: I1203 14:29:51.958085 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.038691 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 
14:29:52.038806 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.038937 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fb5\" (UniqueName: \"kubernetes.io/projected/70b42ea8-681a-44cb-a494-2093b925d015-kube-api-access-45fb5\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.038958 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.044080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.044629 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.044650 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/70b42ea8-681a-44cb-a494-2093b925d015-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.062664 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fb5\" (UniqueName: \"kubernetes.io/projected/70b42ea8-681a-44cb-a494-2093b925d015-kube-api-access-45fb5\") pod \"kube-state-metrics-0\" (UID: \"70b42ea8-681a-44cb-a494-2093b925d015\") " pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.212563 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.307804 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.707960 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 14:29:52 crc kubenswrapper[5004]: W1203 14:29:52.710387 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70b42ea8_681a_44cb_a494_2093b925d015.slice/crio-09d3bc4adfc5e6384952b0c864adb31f3ee522e9de32be23da0f185f1984f656 WatchSource:0}: Error finding container 09d3bc4adfc5e6384952b0c864adb31f3ee522e9de32be23da0f185f1984f656: Status 404 returned error can't find the container with id 09d3bc4adfc5e6384952b0c864adb31f3ee522e9de32be23da0f185f1984f656 Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.746115 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"70b42ea8-681a-44cb-a494-2093b925d015","Type":"ContainerStarted","Data":"09d3bc4adfc5e6384952b0c864adb31f3ee522e9de32be23da0f185f1984f656"} Dec 03 14:29:52 
crc kubenswrapper[5004]: I1203 14:29:52.749604 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerStarted","Data":"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50"} Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.749670 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerStarted","Data":"ea7d0af2a5bd48c385d830e9d20bf187efcd76d813150c157546adde31d72d00"} Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.765187 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.823998 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.824050 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.968277 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nqm7s"] Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.970194 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.973196 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.973286 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 14:29:52 crc kubenswrapper[5004]: I1203 14:29:52.978403 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nqm7s"] Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.055066 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.069875 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.069935 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.069988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhjc8\" (UniqueName: \"kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 
14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.070042 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.123950 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"] Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.124211 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="dnsmasq-dns" containerID="cri-o://322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1" gracePeriod=10 Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.176356 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.177325 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.177429 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: 
\"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.177555 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhjc8\" (UniqueName: \"kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.182576 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.184486 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.184664 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.200600 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhjc8\" (UniqueName: \"kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8\") pod \"nova-cell1-cell-mapping-nqm7s\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " 
pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.292715 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.581222 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.637131 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8a68f6-e00d-4d25-87ac-aa973e7e44cd" path="/var/lib/kubelet/pods/8a8a68f6-e00d-4d25-87ac-aa973e7e44cd/volumes" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.686701 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc\") pod \"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.687126 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config\") pod \"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.687161 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb\") pod \"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.687216 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnsm6\" (UniqueName: \"kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6\") pod 
\"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.687276 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0\") pod \"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.687401 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb\") pod \"291b528c-e4a4-4e8b-b88c-7db763b01f37\" (UID: \"291b528c-e4a4-4e8b-b88c-7db763b01f37\") " Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.692988 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6" (OuterVolumeSpecName: "kube-api-access-xnsm6") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "kube-api-access-xnsm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.746951 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config" (OuterVolumeSpecName: "config") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.748347 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.770725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.785223 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.787741 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"70b42ea8-681a-44cb-a494-2093b925d015","Type":"ContainerStarted","Data":"4ad8033840b2e5e99bbe116ef6692353e575fd4ab6411e4e9f8e26cc655bd614"} Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.788120 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.789468 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.789509 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.789523 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnsm6\" (UniqueName: \"kubernetes.io/projected/291b528c-e4a4-4e8b-b88c-7db763b01f37-kube-api-access-xnsm6\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.789537 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.789548 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.791989 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "291b528c-e4a4-4e8b-b88c-7db763b01f37" (UID: "291b528c-e4a4-4e8b-b88c-7db763b01f37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.809087 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerStarted","Data":"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653"} Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.812223 5004 generic.go:334] "Generic (PLEG): container finished" podID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerID="322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1" exitCode=0 Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.813955 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.816027 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" event={"ID":"291b528c-e4a4-4e8b-b88c-7db763b01f37","Type":"ContainerDied","Data":"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1"} Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.816093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-k2lqp" event={"ID":"291b528c-e4a4-4e8b-b88c-7db763b01f37","Type":"ContainerDied","Data":"ee629e0da718e8c1c36d60688b1df34f03aaf83b0f6adb371a122edf0b7b37f1"} Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.816115 5004 scope.go:117] "RemoveContainer" containerID="322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.828561 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-nqm7s"] Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.834233 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.474518505 podStartE2EDuration="2.834215805s" podCreationTimestamp="2025-12-03 14:29:51 +0000 UTC" firstStartedPulling="2025-12-03 14:29:52.712298033 +0000 UTC m=+1405.461268269" lastFinishedPulling="2025-12-03 14:29:53.071995333 +0000 UTC m=+1405.820965569" observedRunningTime="2025-12-03 14:29:53.808726843 +0000 UTC m=+1406.557697079" watchObservedRunningTime="2025-12-03 14:29:53.834215805 +0000 UTC m=+1406.583186041" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.851996 5004 scope.go:117] "RemoveContainer" containerID="8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.877652 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"] Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.905084 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/291b528c-e4a4-4e8b-b88c-7db763b01f37-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:29:53 crc kubenswrapper[5004]: I1203 14:29:53.914717 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-k2lqp"] Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.024267 5004 scope.go:117] "RemoveContainer" containerID="322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1" Dec 03 14:29:54 crc kubenswrapper[5004]: E1203 14:29:54.024651 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1\": container with ID starting with 322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1 not found: ID does not 
exist" containerID="322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1" Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.024691 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1"} err="failed to get container status \"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1\": rpc error: code = NotFound desc = could not find container \"322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1\": container with ID starting with 322261c46448745b4982191528bc73c80986eb54af2ab9be76981ed51ef724c1 not found: ID does not exist" Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.024717 5004 scope.go:117] "RemoveContainer" containerID="8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf" Dec 03 14:29:54 crc kubenswrapper[5004]: E1203 14:29:54.025019 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf\": container with ID starting with 8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf not found: ID does not exist" containerID="8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf" Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.025048 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf"} err="failed to get container status \"8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf\": rpc error: code = NotFound desc = could not find container \"8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf\": container with ID starting with 8d6eabdfba0aad0470833df1e6aacb5ae8cb8d13fe43ee3f8e5aabfbe0756ddf not found: ID does not exist" Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.825666 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerStarted","Data":"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc"} Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.827506 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nqm7s" event={"ID":"f3cd7387-6950-4af4-9c08-cd702047c728","Type":"ContainerStarted","Data":"3f7be2230eb25301553197e183633bb43751b39758ee910b6e4d093b90ed0a3b"} Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.827557 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nqm7s" event={"ID":"f3cd7387-6950-4af4-9c08-cd702047c728","Type":"ContainerStarted","Data":"865e89d7709d6d96e046d4e3cff45d91f5d1ea1267578e03021118aeea95a46a"} Dec 03 14:29:54 crc kubenswrapper[5004]: I1203 14:29:54.851786 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nqm7s" podStartSLOduration=2.85176652 podStartE2EDuration="2.85176652s" podCreationTimestamp="2025-12-03 14:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:29:54.848825356 +0000 UTC m=+1407.597795592" watchObservedRunningTime="2025-12-03 14:29:54.85176652 +0000 UTC m=+1407.600736756" Dec 03 14:29:55 crc kubenswrapper[5004]: I1203 14:29:55.627077 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" path="/var/lib/kubelet/pods/291b528c-e4a4-4e8b-b88c-7db763b01f37/volumes" Dec 03 14:29:56 crc kubenswrapper[5004]: I1203 14:29:56.853654 5004 generic.go:334] "Generic (PLEG): container finished" podID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerID="461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806" exitCode=1 Dec 03 14:29:56 crc kubenswrapper[5004]: I1203 14:29:56.853827 
5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerDied","Data":"461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806"} Dec 03 14:29:56 crc kubenswrapper[5004]: I1203 14:29:56.854007 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-central-agent" containerID="cri-o://cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50" gracePeriod=30 Dec 03 14:29:56 crc kubenswrapper[5004]: I1203 14:29:56.854276 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="sg-core" containerID="cri-o://84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc" gracePeriod=30 Dec 03 14:29:56 crc kubenswrapper[5004]: I1203 14:29:56.854347 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-notification-agent" containerID="cri-o://28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653" gracePeriod=30 Dec 03 14:29:57 crc kubenswrapper[5004]: I1203 14:29:57.867277 5004 generic.go:334] "Generic (PLEG): container finished" podID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerID="84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc" exitCode=2 Dec 03 14:29:57 crc kubenswrapper[5004]: I1203 14:29:57.867318 5004 generic.go:334] "Generic (PLEG): container finished" podID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerID="28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653" exitCode=0 Dec 03 14:29:57 crc kubenswrapper[5004]: I1203 14:29:57.867344 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerDied","Data":"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc"} Dec 03 14:29:57 crc kubenswrapper[5004]: I1203 14:29:57.867374 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerDied","Data":"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653"} Dec 03 14:29:59 crc kubenswrapper[5004]: I1203 14:29:59.888737 5004 generic.go:334] "Generic (PLEG): container finished" podID="f3cd7387-6950-4af4-9c08-cd702047c728" containerID="3f7be2230eb25301553197e183633bb43751b39758ee910b6e4d093b90ed0a3b" exitCode=0 Dec 03 14:29:59 crc kubenswrapper[5004]: I1203 14:29:59.888907 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nqm7s" event={"ID":"f3cd7387-6950-4af4-9c08-cd702047c728","Type":"ContainerDied","Data":"3f7be2230eb25301553197e183633bb43751b39758ee910b6e4d093b90ed0a3b"} Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.105216 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.105286 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.151079 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"] Dec 03 14:30:00 crc kubenswrapper[5004]: E1203 14:30:00.151984 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="dnsmasq-dns" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.152003 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="dnsmasq-dns" Dec 03 14:30:00 crc kubenswrapper[5004]: E1203 14:30:00.152029 5004 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="init" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.152037 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="init" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.152241 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="291b528c-e4a4-4e8b-b88c-7db763b01f37" containerName="dnsmasq-dns" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.153061 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.168659 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"] Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.192774 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.192996 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.246231 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.246419 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.247227 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blklz\" (UniqueName: \"kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.351244 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.351305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blklz\" (UniqueName: \"kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.351461 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc 
kubenswrapper[5004]: I1203 14:30:00.355790 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.371649 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.375011 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blklz\" (UniqueName: \"kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz\") pod \"collect-profiles-29412870-k2ptt\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.433554 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.517084 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554522 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554602 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554630 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554732 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554834 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n8v5\" (UniqueName: \"kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.554892 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.555290 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd\") pod \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\" (UID: \"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e\") " Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.555884 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.556651 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.556905 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.559016 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts" (OuterVolumeSpecName: "scripts") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.562249 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5" (OuterVolumeSpecName: "kube-api-access-9n8v5") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "kube-api-access-9n8v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.584850 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.645206 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.658833 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n8v5\" (UniqueName: \"kubernetes.io/projected/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-kube-api-access-9n8v5\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.658950 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.658964 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.658975 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.658986 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.668190 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data" (OuterVolumeSpecName: "config-data") pod "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" (UID: "44f095fe-f931-4dd8-8c1b-9f3e67c9a33e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.760274 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.900031 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.900054 5004 generic.go:334] "Generic (PLEG): container finished" podID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerID="cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50" exitCode=0 Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.900131 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerDied","Data":"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50"} Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.900184 5004 scope.go:117] "RemoveContainer" containerID="461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.900418 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44f095fe-f931-4dd8-8c1b-9f3e67c9a33e","Type":"ContainerDied","Data":"ea7d0af2a5bd48c385d830e9d20bf187efcd76d813150c157546adde31d72d00"} Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.939467 5004 scope.go:117] "RemoveContainer" containerID="84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc" Dec 03 14:30:00 crc kubenswrapper[5004]: I1203 14:30:00.943583 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.002752 5004 scope.go:117] "RemoveContainer" 
containerID="28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.030515 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.046633 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.047196 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-notification-agent" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047216 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-notification-agent" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.047252 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="sg-core" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047260 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="sg-core" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.047292 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="proxy-httpd" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047301 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="proxy-httpd" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.047313 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-central-agent" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047319 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-central-agent" Dec 03 14:30:01 
crc kubenswrapper[5004]: I1203 14:30:01.047556 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="sg-core" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047576 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="proxy-httpd" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047592 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-notification-agent" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.047611 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" containerName="ceilometer-central-agent" Dec 03 14:30:01 crc kubenswrapper[5004]: W1203 14:30:01.059382 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6b75fdb_1d0a_4b9b_a615_57cea78634da.slice/crio-984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1 WatchSource:0}: Error finding container 984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1: Status 404 returned error can't find the container with id 984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1 Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.065598 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.065631 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"] Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.065716 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.069016 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.069876 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.070179 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114255 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-scripts\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114322 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114346 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114397 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-run-httpd\") pod \"ceilometer-0\" (UID: 
\"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114473 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-log-httpd\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114496 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114520 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-config-data\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.114593 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62sx\" (UniqueName: \"kubernetes.io/projected/fbee9411-e6cf-4d99-89f8-788a0529e8e2-kube-api-access-z62sx\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.140211 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 
14:30:01.140690 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.159570 5004 scope.go:117] "RemoveContainer" containerID="cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.215876 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-log-httpd\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.216195 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.216236 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-config-data\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.216457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62sx\" (UniqueName: \"kubernetes.io/projected/fbee9411-e6cf-4d99-89f8-788a0529e8e2-kube-api-access-z62sx\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.217007 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-scripts\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.217134 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.217206 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.217129 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-log-httpd\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.217422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-run-httpd\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.218126 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbee9411-e6cf-4d99-89f8-788a0529e8e2-run-httpd\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " 
pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.226078 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.226091 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.227333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.236253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-scripts\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.236896 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbee9411-e6cf-4d99-89f8-788a0529e8e2-config-data\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.243343 5004 scope.go:117] "RemoveContainer" containerID="461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 
14:30:01.247413 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806\": container with ID starting with 461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806 not found: ID does not exist" containerID="461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.247484 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806"} err="failed to get container status \"461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806\": rpc error: code = NotFound desc = could not find container \"461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806\": container with ID starting with 461c57fb353b89d6847101e87ad36f71c15663f22616f4041ca2cfe85ba87806 not found: ID does not exist" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.247511 5004 scope.go:117] "RemoveContainer" containerID="84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.248908 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc\": container with ID starting with 84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc not found: ID does not exist" containerID="84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.248936 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc"} err="failed to get container status \"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc\": rpc 
error: code = NotFound desc = could not find container \"84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc\": container with ID starting with 84e8af9aff84d3376609394e4a2377870b0b25a44fe9641c79ecbb63eaed0acc not found: ID does not exist" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.248954 5004 scope.go:117] "RemoveContainer" containerID="28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.252988 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653\": container with ID starting with 28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653 not found: ID does not exist" containerID="28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.253025 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653"} err="failed to get container status \"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653\": rpc error: code = NotFound desc = could not find container \"28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653\": container with ID starting with 28500a4d0bb97094fc09ea49d6c6ef6c5bd042adbcd8d9dee612773237afe653 not found: ID does not exist" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.253049 5004 scope.go:117] "RemoveContainer" containerID="cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50" Dec 03 14:30:01 crc kubenswrapper[5004]: E1203 14:30:01.253362 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50\": container with ID starting with 
cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50 not found: ID does not exist" containerID="cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.253385 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50"} err="failed to get container status \"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50\": rpc error: code = NotFound desc = could not find container \"cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50\": container with ID starting with cf2ae36fd98c52093f6e4152a53468be37125cb2f95e6ef0079b8fec5015ef50 not found: ID does not exist" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.269559 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62sx\" (UniqueName: \"kubernetes.io/projected/fbee9411-e6cf-4d99-89f8-788a0529e8e2-kube-api-access-z62sx\") pod \"ceilometer-0\" (UID: \"fbee9411-e6cf-4d99-89f8-788a0529e8e2\") " pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.364275 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.421577 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhjc8\" (UniqueName: \"kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8\") pod \"f3cd7387-6950-4af4-9c08-cd702047c728\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.421962 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data\") pod \"f3cd7387-6950-4af4-9c08-cd702047c728\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.422035 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts\") pod \"f3cd7387-6950-4af4-9c08-cd702047c728\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.422343 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle\") pod \"f3cd7387-6950-4af4-9c08-cd702047c728\" (UID: \"f3cd7387-6950-4af4-9c08-cd702047c728\") " Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.428019 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts" (OuterVolumeSpecName: "scripts") pod "f3cd7387-6950-4af4-9c08-cd702047c728" (UID: "f3cd7387-6950-4af4-9c08-cd702047c728"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.428143 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8" (OuterVolumeSpecName: "kube-api-access-jhjc8") pod "f3cd7387-6950-4af4-9c08-cd702047c728" (UID: "f3cd7387-6950-4af4-9c08-cd702047c728"). InnerVolumeSpecName "kube-api-access-jhjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.456654 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3cd7387-6950-4af4-9c08-cd702047c728" (UID: "f3cd7387-6950-4af4-9c08-cd702047c728"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.461148 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data" (OuterVolumeSpecName: "config-data") pod "f3cd7387-6950-4af4-9c08-cd702047c728" (UID: "f3cd7387-6950-4af4-9c08-cd702047c728"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.514985 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.525029 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhjc8\" (UniqueName: \"kubernetes.io/projected/f3cd7387-6950-4af4-9c08-cd702047c728-kube-api-access-jhjc8\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.525068 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.525082 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.525091 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd7387-6950-4af4-9c08-cd702047c728-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.635548 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f095fe-f931-4dd8-8c1b-9f3e67c9a33e" path="/var/lib/kubelet/pods/44f095fe-f931-4dd8-8c1b-9f3e67c9a33e/volumes" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.918189 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nqm7s" event={"ID":"f3cd7387-6950-4af4-9c08-cd702047c728","Type":"ContainerDied","Data":"865e89d7709d6d96e046d4e3cff45d91f5d1ea1267578e03021118aeea95a46a"} Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.918236 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865e89d7709d6d96e046d4e3cff45d91f5d1ea1267578e03021118aeea95a46a" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.918311 5004 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nqm7s" Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.923091 5004 generic.go:334] "Generic (PLEG): container finished" podID="c6b75fdb-1d0a-4b9b-a615-57cea78634da" containerID="bafd4d869393dc00f10a3311f85e90fca74e2a38ff304aeb4d32122162db11ba" exitCode=0 Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.923424 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" event={"ID":"c6b75fdb-1d0a-4b9b-a615-57cea78634da","Type":"ContainerDied","Data":"bafd4d869393dc00f10a3311f85e90fca74e2a38ff304aeb4d32122162db11ba"} Dec 03 14:30:01 crc kubenswrapper[5004]: I1203 14:30:01.923498 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" event={"ID":"c6b75fdb-1d0a-4b9b-a615-57cea78634da","Type":"ContainerStarted","Data":"984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1"} Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:01.996207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.109550 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.109820 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerName="nova-scheduler-scheduler" containerID="cri-o://579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" gracePeriod=30 Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.128721 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.128974 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-log" containerID="cri-o://56fe51ab5327e69a4e4db3a7752dd199995e19892d15cf1344a02060b67a1bb2" gracePeriod=30 Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.129262 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-api" containerID="cri-o://a15c3295c1dcc4fb4c273859c3721801a4f74ede1d975e7a8169766989250c8c" gracePeriod=30 Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.147456 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.147698 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log" containerID="cri-o://c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062" gracePeriod=30 Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.147835 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata" containerID="cri-o://62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435" gracePeriod=30 Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.236275 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 14:30:02 crc kubenswrapper[5004]: E1203 14:30:02.776614 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 14:30:02 crc 
kubenswrapper[5004]: E1203 14:30:02.779031 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 14:30:02 crc kubenswrapper[5004]: E1203 14:30:02.780364 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 14:30:02 crc kubenswrapper[5004]: E1203 14:30:02.780500 5004 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerName="nova-scheduler-scheduler"
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.940363 5004 generic.go:334] "Generic (PLEG): container finished" podID="20409804-6695-45da-ae3f-68d988218b01" containerID="c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062" exitCode=143
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.940473 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerDied","Data":"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"}
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.944496 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbee9411-e6cf-4d99-89f8-788a0529e8e2","Type":"ContainerStarted","Data":"19b309dfecf4e93aeec259f02d215ba86c47419ee3373b0ee486eedf6252bf81"}
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.944716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbee9411-e6cf-4d99-89f8-788a0529e8e2","Type":"ContainerStarted","Data":"ced3c9e95f62cd66cfa57b9c997562893131ddb474dfd668206ee75f74e3103c"}
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.946670 5004 generic.go:334] "Generic (PLEG): container finished" podID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerID="56fe51ab5327e69a4e4db3a7752dd199995e19892d15cf1344a02060b67a1bb2" exitCode=143
Dec 03 14:30:02 crc kubenswrapper[5004]: I1203 14:30:02.946776 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerDied","Data":"56fe51ab5327e69a4e4db3a7752dd199995e19892d15cf1344a02060b67a1bb2"}
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.492279 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.573087 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume\") pod \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") "
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.573176 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume\") pod \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") "
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.573254 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blklz\" (UniqueName: \"kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz\") pod \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\" (UID: \"c6b75fdb-1d0a-4b9b-a615-57cea78634da\") "
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.574704 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume" (OuterVolumeSpecName: "config-volume") pod "c6b75fdb-1d0a-4b9b-a615-57cea78634da" (UID: "c6b75fdb-1d0a-4b9b-a615-57cea78634da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.578614 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c6b75fdb-1d0a-4b9b-a615-57cea78634da" (UID: "c6b75fdb-1d0a-4b9b-a615-57cea78634da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.583537 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz" (OuterVolumeSpecName: "kube-api-access-blklz") pod "c6b75fdb-1d0a-4b9b-a615-57cea78634da" (UID: "c6b75fdb-1d0a-4b9b-a615-57cea78634da"). InnerVolumeSpecName "kube-api-access-blklz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.675508 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b75fdb-1d0a-4b9b-a615-57cea78634da-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.675558 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blklz\" (UniqueName: \"kubernetes.io/projected/c6b75fdb-1d0a-4b9b-a615-57cea78634da-kube-api-access-blklz\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.675572 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b75fdb-1d0a-4b9b-a615-57cea78634da-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.962485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt" event={"ID":"c6b75fdb-1d0a-4b9b-a615-57cea78634da","Type":"ContainerDied","Data":"984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1"}
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.962556 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="984ec9eee005dd442d11d59ff0bd895cb8c5a32684d234036a6ddde2469031c1"
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.962674 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"
Dec 03 14:30:03 crc kubenswrapper[5004]: I1203 14:30:03.964541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbee9411-e6cf-4d99-89f8-788a0529e8e2","Type":"ContainerStarted","Data":"9b29977edd25238985e133a2d8b58d43e8ee31b84ebbec044765937e4eed2437"}
Dec 03 14:30:04 crc kubenswrapper[5004]: I1203 14:30:04.978075 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbee9411-e6cf-4d99-89f8-788a0529e8e2","Type":"ContainerStarted","Data":"b60dc8fa0bf0ae44c858735532ffebc9d66f3434b3d378672fa67a81554277c1"}
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.287716 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:42136->10.217.0.191:8775: read: connection reset by peer"
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.287776 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:42152->10.217.0.191:8775: read: connection reset by peer"
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.771184 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.815037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs\") pod \"20409804-6695-45da-ae3f-68d988218b01\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") "
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.815158 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle\") pod \"20409804-6695-45da-ae3f-68d988218b01\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") "
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.815210 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data\") pod \"20409804-6695-45da-ae3f-68d988218b01\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") "
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.815282 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnpjx\" (UniqueName: \"kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx\") pod \"20409804-6695-45da-ae3f-68d988218b01\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") "
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.815309 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs\") pod \"20409804-6695-45da-ae3f-68d988218b01\" (UID: \"20409804-6695-45da-ae3f-68d988218b01\") "
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.817147 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs" (OuterVolumeSpecName: "logs") pod "20409804-6695-45da-ae3f-68d988218b01" (UID: "20409804-6695-45da-ae3f-68d988218b01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.821127 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx" (OuterVolumeSpecName: "kube-api-access-lnpjx") pod "20409804-6695-45da-ae3f-68d988218b01" (UID: "20409804-6695-45da-ae3f-68d988218b01"). InnerVolumeSpecName "kube-api-access-lnpjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.874048 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data" (OuterVolumeSpecName: "config-data") pod "20409804-6695-45da-ae3f-68d988218b01" (UID: "20409804-6695-45da-ae3f-68d988218b01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.885492 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20409804-6695-45da-ae3f-68d988218b01" (UID: "20409804-6695-45da-ae3f-68d988218b01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.894734 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "20409804-6695-45da-ae3f-68d988218b01" (UID: "20409804-6695-45da-ae3f-68d988218b01"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.916974 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.917010 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.917021 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnpjx\" (UniqueName: \"kubernetes.io/projected/20409804-6695-45da-ae3f-68d988218b01-kube-api-access-lnpjx\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.917033 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20409804-6695-45da-ae3f-68d988218b01-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.917042 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20409804-6695-45da-ae3f-68d988218b01-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.989181 5004 generic.go:334] "Generic (PLEG): container finished" podID="20409804-6695-45da-ae3f-68d988218b01" containerID="62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435" exitCode=0
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.989244 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerDied","Data":"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"}
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.989269 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.989290 5004 scope.go:117] "RemoveContainer" containerID="62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.989278 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20409804-6695-45da-ae3f-68d988218b01","Type":"ContainerDied","Data":"f2b7a3b3efa89b66c510f5fd0b89653d5f6e7fdd44222dcb4c81ad5fa3d19b04"}
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.992984 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbee9411-e6cf-4d99-89f8-788a0529e8e2","Type":"ContainerStarted","Data":"dd0c044351952d82a126bdc403938d4c52daa834e855332d7b8810d00fc4a120"}
Dec 03 14:30:05 crc kubenswrapper[5004]: I1203 14:30:05.994057 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.022444 5004 scope.go:117] "RemoveContainer" containerID="c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.026573 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.480322721 podStartE2EDuration="6.026553228s" podCreationTimestamp="2025-12-03 14:30:00 +0000 UTC" firstStartedPulling="2025-12-03 14:30:02.00337186 +0000 UTC m=+1414.752342116" lastFinishedPulling="2025-12-03 14:30:05.549602387 +0000 UTC m=+1418.298572623" observedRunningTime="2025-12-03 14:30:06.023398458 +0000 UTC m=+1418.772368704" watchObservedRunningTime="2025-12-03 14:30:06.026553228 +0000 UTC m=+1418.775523464"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.060282 5004 scope.go:117] "RemoveContainer" containerID="62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.062047 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435\": container with ID starting with 62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435 not found: ID does not exist" containerID="62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.062079 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435"} err="failed to get container status \"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435\": rpc error: code = NotFound desc = could not find container \"62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435\": container with ID starting with 62481ad9dfafa16725c076a44a7869530b6896986181a5e4399658a55e4bb435 not found: ID does not exist"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.062100 5004 scope.go:117] "RemoveContainer" containerID="c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.063163 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062\": container with ID starting with c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062 not found: ID does not exist" containerID="c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.063193 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062"} err="failed to get container status \"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062\": rpc error: code = NotFound desc = could not find container \"c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062\": container with ID starting with c61a21eaf043f765d7b69b3ce8f40c5c12a791fe06b54199cd8fc5d37e327062 not found: ID does not exist"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.069436 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.099971 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.101428 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.101969 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.101989 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log"
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.102027 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cd7387-6950-4af4-9c08-cd702047c728" containerName="nova-manage"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102036 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cd7387-6950-4af4-9c08-cd702047c728" containerName="nova-manage"
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.102069 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b75fdb-1d0a-4b9b-a615-57cea78634da" containerName="collect-profiles"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102078 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b75fdb-1d0a-4b9b-a615-57cea78634da" containerName="collect-profiles"
Dec 03 14:30:06 crc kubenswrapper[5004]: E1203 14:30:06.102091 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102099 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102305 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-log"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102323 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cd7387-6950-4af4-9c08-cd702047c728" containerName="nova-manage"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102343 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b75fdb-1d0a-4b9b-a615-57cea78634da" containerName="collect-profiles"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.102358 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="20409804-6695-45da-ae3f-68d988218b01" containerName="nova-metadata-metadata"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.103660 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.106298 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.108064 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.120248 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a010cac3-4f37-4ffd-8627-5329e566d91a-logs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.120360 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-config-data\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.120411 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v462\" (UniqueName: \"kubernetes.io/projected/a010cac3-4f37-4ffd-8627-5329e566d91a-kube-api-access-6v462\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.120449 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.120587 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.121274 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.222325 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v462\" (UniqueName: \"kubernetes.io/projected/a010cac3-4f37-4ffd-8627-5329e566d91a-kube-api-access-6v462\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.222406 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.222431 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.222485 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a010cac3-4f37-4ffd-8627-5329e566d91a-logs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.222557 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-config-data\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.223382 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a010cac3-4f37-4ffd-8627-5329e566d91a-logs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.227071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-config-data\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.227610 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.228027 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a010cac3-4f37-4ffd-8627-5329e566d91a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.246278 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v462\" (UniqueName: \"kubernetes.io/projected/a010cac3-4f37-4ffd-8627-5329e566d91a-kube-api-access-6v462\") pod \"nova-metadata-0\" (UID: \"a010cac3-4f37-4ffd-8627-5329e566d91a\") " pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.434330 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 14:30:06 crc kubenswrapper[5004]: I1203 14:30:06.733424 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 14:30:06 crc kubenswrapper[5004]: W1203 14:30:06.756270 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda010cac3_4f37_4ffd_8627_5329e566d91a.slice/crio-4087d353b4fe40363f326e6ecb28a73ef519e3bf9a45ba25e94bb596ccd022a3 WatchSource:0}: Error finding container 4087d353b4fe40363f326e6ecb28a73ef519e3bf9a45ba25e94bb596ccd022a3: Status 404 returned error can't find the container with id 4087d353b4fe40363f326e6ecb28a73ef519e3bf9a45ba25e94bb596ccd022a3
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.004748 5004 generic.go:334] "Generic (PLEG): container finished" podID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerID="579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" exitCode=0
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.005120 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad294a12-4be8-4326-8f8e-8aec9157343d","Type":"ContainerDied","Data":"579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68"}
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.007229 5004 generic.go:334] "Generic (PLEG): container finished" podID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerID="a15c3295c1dcc4fb4c273859c3721801a4f74ede1d975e7a8169766989250c8c" exitCode=0
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.007275 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerDied","Data":"a15c3295c1dcc4fb4c273859c3721801a4f74ede1d975e7a8169766989250c8c"}
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.009784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a010cac3-4f37-4ffd-8627-5329e566d91a","Type":"ContainerStarted","Data":"5727f30dedb649eab2db3bdbae878259078b1e8911099c76d9190aa07c516d9b"}
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.009814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a010cac3-4f37-4ffd-8627-5329e566d91a","Type":"ContainerStarted","Data":"4087d353b4fe40363f326e6ecb28a73ef519e3bf9a45ba25e94bb596ccd022a3"}
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.073010 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.150578 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.245553 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle\") pod \"ad294a12-4be8-4326-8f8e-8aec9157343d\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.246273 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data\") pod \"ad294a12-4be8-4326-8f8e-8aec9157343d\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.246446 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56g4q\" (UniqueName: \"kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q\") pod \"ad294a12-4be8-4326-8f8e-8aec9157343d\" (UID: \"ad294a12-4be8-4326-8f8e-8aec9157343d\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.266203 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q" (OuterVolumeSpecName: "kube-api-access-56g4q") pod "ad294a12-4be8-4326-8f8e-8aec9157343d" (UID: "ad294a12-4be8-4326-8f8e-8aec9157343d"). InnerVolumeSpecName "kube-api-access-56g4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.290960 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad294a12-4be8-4326-8f8e-8aec9157343d" (UID: "ad294a12-4be8-4326-8f8e-8aec9157343d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.302976 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data" (OuterVolumeSpecName: "config-data") pod "ad294a12-4be8-4326-8f8e-8aec9157343d" (UID: "ad294a12-4be8-4326-8f8e-8aec9157343d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348015 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348123 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348199 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348301 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348335 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btm7v\" (UniqueName: \"kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.348356 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle\") pod \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\" (UID: \"7fca090f-5247-452b-8ff5-e2bbdcba7eb3\") "
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.349001 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56g4q\" (UniqueName: \"kubernetes.io/projected/ad294a12-4be8-4326-8f8e-8aec9157343d-kube-api-access-56g4q\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.349027 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.349041 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad294a12-4be8-4326-8f8e-8aec9157343d-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.349053 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs" (OuterVolumeSpecName: "logs") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.356020 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v" (OuterVolumeSpecName: "kube-api-access-btm7v") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "kube-api-access-btm7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.372341 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data" (OuterVolumeSpecName: "config-data") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.380194 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.409425 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.410424 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7fca090f-5247-452b-8ff5-e2bbdcba7eb3" (UID: "7fca090f-5247-452b-8ff5-e2bbdcba7eb3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451365 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451413 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451425 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-logs\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451438 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451450 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btm7v\" (UniqueName: \"kubernetes.io/projected/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-kube-api-access-btm7v\") on node \"crc\" DevicePath \"\""
Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.451463 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/7fca090f-5247-452b-8ff5-e2bbdcba7eb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:07 crc kubenswrapper[5004]: I1203 14:30:07.626912 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20409804-6695-45da-ae3f-68d988218b01" path="/var/lib/kubelet/pods/20409804-6695-45da-ae3f-68d988218b01/volumes" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.019643 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a010cac3-4f37-4ffd-8627-5329e566d91a","Type":"ContainerStarted","Data":"9b119124955d883f2195e8d6c1f2aea46a7317a726ebe7a973cb4e7160c796f7"} Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.022132 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ad294a12-4be8-4326-8f8e-8aec9157343d","Type":"ContainerDied","Data":"cb40f856ea3b05760057176d3c62b095b808925640f49de1fb957aa4eaff68e3"} Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.022170 5004 scope.go:117] "RemoveContainer" containerID="579be05d80892f774321ea1604dcee0bcd6f300bf1388d25dd7768d6d2d51c68" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.022174 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.031855 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.031849 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7fca090f-5247-452b-8ff5-e2bbdcba7eb3","Type":"ContainerDied","Data":"b4108f810acaad5f959fb7b4c99d3bad040e710231bf449ffe88531fc299c4ba"} Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.063657 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.063639524 podStartE2EDuration="2.063639524s" podCreationTimestamp="2025-12-03 14:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:08.045746777 +0000 UTC m=+1420.794717013" watchObservedRunningTime="2025-12-03 14:30:08.063639524 +0000 UTC m=+1420.812609760" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.064468 5004 scope.go:117] "RemoveContainer" containerID="a15c3295c1dcc4fb4c273859c3721801a4f74ede1d975e7a8169766989250c8c" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.077940 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.097672 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.121724 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: E1203 14:30:08.122183 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-log" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122201 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-log" Dec 03 14:30:08 crc kubenswrapper[5004]: E1203 14:30:08.122232 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerName="nova-scheduler-scheduler" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122241 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerName="nova-scheduler-scheduler" Dec 03 14:30:08 crc kubenswrapper[5004]: E1203 14:30:08.122283 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-api" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122290 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-api" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122506 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-log" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122536 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" containerName="nova-scheduler-scheduler" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.122546 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" containerName="nova-api-api" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.123734 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.128077 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.128554 5004 scope.go:117] "RemoveContainer" containerID="56fe51ab5327e69a4e4db3a7752dd199995e19892d15cf1344a02060b67a1bb2" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.129650 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.130044 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.136837 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.169757 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.180567 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.191793 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.193442 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.197337 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.203422 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.268592 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-config-data\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.268671 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-public-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.268809 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.268905 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgnt\" (UniqueName: \"kubernetes.io/projected/489bba73-88c3-42e0-ad06-ee95a6073263-kube-api-access-4cgnt\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.268973 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/489bba73-88c3-42e0-ad06-ee95a6073263-logs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.269160 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-internal-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371187 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-public-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371228 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-config-data\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371259 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgnt\" (UniqueName: \"kubernetes.io/projected/489bba73-88c3-42e0-ad06-ee95a6073263-kube-api-access-4cgnt\") pod 
\"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371320 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/489bba73-88c3-42e0-ad06-ee95a6073263-logs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371340 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspdn\" (UniqueName: \"kubernetes.io/projected/c4e5958b-4572-4965-8948-89fc51a2c486-kube-api-access-rspdn\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371388 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371446 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-internal-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.371508 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-config-data\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.373231 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/489bba73-88c3-42e0-ad06-ee95a6073263-logs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.381790 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-config-data\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.382663 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-internal-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.382761 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.383477 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/489bba73-88c3-42e0-ad06-ee95a6073263-public-tls-certs\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.394605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgnt\" (UniqueName: \"kubernetes.io/projected/489bba73-88c3-42e0-ad06-ee95a6073263-kube-api-access-4cgnt\") pod \"nova-api-0\" (UID: \"489bba73-88c3-42e0-ad06-ee95a6073263\") " pod="openstack/nova-api-0" 
Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.458847 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.473063 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.473216 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-config-data\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.473265 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspdn\" (UniqueName: \"kubernetes.io/projected/c4e5958b-4572-4965-8948-89fc51a2c486-kube-api-access-rspdn\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.476655 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-config-data\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.476905 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e5958b-4572-4965-8948-89fc51a2c486-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 
crc kubenswrapper[5004]: I1203 14:30:08.492376 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspdn\" (UniqueName: \"kubernetes.io/projected/c4e5958b-4572-4965-8948-89fc51a2c486-kube-api-access-rspdn\") pod \"nova-scheduler-0\" (UID: \"c4e5958b-4572-4965-8948-89fc51a2c486\") " pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.515608 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 14:30:08 crc kubenswrapper[5004]: W1203 14:30:08.922747 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod489bba73_88c3_42e0_ad06_ee95a6073263.slice/crio-00b6a702c73e355e1a0a10040f81acd9faa64279529f0e0ac9608c7abbee87d3 WatchSource:0}: Error finding container 00b6a702c73e355e1a0a10040f81acd9faa64279529f0e0ac9608c7abbee87d3: Status 404 returned error can't find the container with id 00b6a702c73e355e1a0a10040f81acd9faa64279529f0e0ac9608c7abbee87d3 Dec 03 14:30:08 crc kubenswrapper[5004]: I1203 14:30:08.925047 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 14:30:09 crc kubenswrapper[5004]: W1203 14:30:09.006422 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4e5958b_4572_4965_8948_89fc51a2c486.slice/crio-d0c29e002a19d10dec903d81640581dbae63428f573b93a6be8463488a2d06a3 WatchSource:0}: Error finding container d0c29e002a19d10dec903d81640581dbae63428f573b93a6be8463488a2d06a3: Status 404 returned error can't find the container with id d0c29e002a19d10dec903d81640581dbae63428f573b93a6be8463488a2d06a3 Dec 03 14:30:09 crc kubenswrapper[5004]: I1203 14:30:09.016756 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 14:30:09 crc kubenswrapper[5004]: I1203 14:30:09.054696 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4e5958b-4572-4965-8948-89fc51a2c486","Type":"ContainerStarted","Data":"d0c29e002a19d10dec903d81640581dbae63428f573b93a6be8463488a2d06a3"} Dec 03 14:30:09 crc kubenswrapper[5004]: I1203 14:30:09.057507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"489bba73-88c3-42e0-ad06-ee95a6073263","Type":"ContainerStarted","Data":"00b6a702c73e355e1a0a10040f81acd9faa64279529f0e0ac9608c7abbee87d3"} Dec 03 14:30:09 crc kubenswrapper[5004]: I1203 14:30:09.624006 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fca090f-5247-452b-8ff5-e2bbdcba7eb3" path="/var/lib/kubelet/pods/7fca090f-5247-452b-8ff5-e2bbdcba7eb3/volumes" Dec 03 14:30:09 crc kubenswrapper[5004]: I1203 14:30:09.625103 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad294a12-4be8-4326-8f8e-8aec9157343d" path="/var/lib/kubelet/pods/ad294a12-4be8-4326-8f8e-8aec9157343d/volumes" Dec 03 14:30:10 crc kubenswrapper[5004]: I1203 14:30:10.067789 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c4e5958b-4572-4965-8948-89fc51a2c486","Type":"ContainerStarted","Data":"1752534ef735117df5afc1af2cf9a25bf26ecdb61ab3ccb1ff4098a0ab0c1ca7"} Dec 03 14:30:10 crc kubenswrapper[5004]: I1203 14:30:10.071499 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"489bba73-88c3-42e0-ad06-ee95a6073263","Type":"ContainerStarted","Data":"40ebe27d615cf7b061f7cf6fe5b103fa93444fbdc42b8cde4a0c9ace369ab9c5"} Dec 03 14:30:10 crc kubenswrapper[5004]: I1203 14:30:10.071541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"489bba73-88c3-42e0-ad06-ee95a6073263","Type":"ContainerStarted","Data":"8a148fecbe01f2fcf86f6233d9b5315b6d459363401644a024412f86212ac130"} Dec 03 14:30:10 crc kubenswrapper[5004]: I1203 14:30:10.095142 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.095119521 podStartE2EDuration="2.095119521s" podCreationTimestamp="2025-12-03 14:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:10.083213243 +0000 UTC m=+1422.832183489" watchObservedRunningTime="2025-12-03 14:30:10.095119521 +0000 UTC m=+1422.844089767" Dec 03 14:30:10 crc kubenswrapper[5004]: I1203 14:30:10.112100 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.112076471 podStartE2EDuration="2.112076471s" podCreationTimestamp="2025-12-03 14:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:10.105884125 +0000 UTC m=+1422.854854351" watchObservedRunningTime="2025-12-03 14:30:10.112076471 +0000 UTC m=+1422.861046707" Dec 03 14:30:11 crc kubenswrapper[5004]: I1203 14:30:11.434813 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:30:11 crc kubenswrapper[5004]: I1203 14:30:11.435419 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 14:30:13 crc kubenswrapper[5004]: I1203 14:30:13.516675 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 14:30:16 crc kubenswrapper[5004]: I1203 14:30:16.435374 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:30:16 crc kubenswrapper[5004]: I1203 14:30:16.436060 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 14:30:17 crc kubenswrapper[5004]: I1203 14:30:17.453092 5004 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="a010cac3-4f37-4ffd-8627-5329e566d91a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:17 crc kubenswrapper[5004]: I1203 14:30:17.453089 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a010cac3-4f37-4ffd-8627-5329e566d91a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:18 crc kubenswrapper[5004]: I1203 14:30:18.460185 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:30:18 crc kubenswrapper[5004]: I1203 14:30:18.460250 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 14:30:18 crc kubenswrapper[5004]: I1203 14:30:18.516300 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 14:30:18 crc kubenswrapper[5004]: I1203 14:30:18.542235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 14:30:19 crc kubenswrapper[5004]: I1203 14:30:19.188408 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 14:30:19 crc kubenswrapper[5004]: I1203 14:30:19.478014 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="489bba73-88c3-42e0-ad06-ee95a6073263" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:19 crc kubenswrapper[5004]: I1203 14:30:19.478019 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="489bba73-88c3-42e0-ad06-ee95a6073263" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 14:30:22 crc kubenswrapper[5004]: I1203 14:30:22.824177 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:30:22 crc kubenswrapper[5004]: I1203 14:30:22.824696 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:30:26 crc kubenswrapper[5004]: I1203 14:30:26.440419 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:30:26 crc kubenswrapper[5004]: I1203 14:30:26.443480 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 14:30:26 crc kubenswrapper[5004]: I1203 14:30:26.447743 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:30:27 crc kubenswrapper[5004]: I1203 14:30:27.229182 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 14:30:28 crc kubenswrapper[5004]: I1203 14:30:28.465899 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:30:28 crc kubenswrapper[5004]: I1203 14:30:28.466658 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:30:28 crc 
kubenswrapper[5004]: I1203 14:30:28.468063 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 14:30:28 crc kubenswrapper[5004]: I1203 14:30:28.471398 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:30:29 crc kubenswrapper[5004]: I1203 14:30:29.240726 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 14:30:29 crc kubenswrapper[5004]: I1203 14:30:29.251605 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 14:30:31 crc kubenswrapper[5004]: I1203 14:30:31.525745 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 14:30:41 crc kubenswrapper[5004]: I1203 14:30:41.513350 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:42 crc kubenswrapper[5004]: I1203 14:30:42.420224 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:45 crc kubenswrapper[5004]: I1203 14:30:45.846372 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="rabbitmq" containerID="cri-o://ce016eb24268eeb5262d178cd7816a4e0120c4f2932fcdc39308002822a6a623" gracePeriod=604796 Dec 03 14:30:46 crc kubenswrapper[5004]: I1203 14:30:46.420566 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="rabbitmq" containerID="cri-o://6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba" gracePeriod=604797 Dec 03 14:30:51 crc kubenswrapper[5004]: I1203 14:30:51.666633 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 03 14:30:51 crc kubenswrapper[5004]: I1203 14:30:51.997631 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.493261 5004 generic.go:334] "Generic (PLEG): container finished" podID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerID="ce016eb24268eeb5262d178cd7816a4e0120c4f2932fcdc39308002822a6a623" exitCode=0 Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.493303 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerDied","Data":"ce016eb24268eeb5262d178cd7816a4e0120c4f2932fcdc39308002822a6a623"} Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.823959 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.824259 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.824300 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 
14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.825398 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.825460 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d" gracePeriod=600 Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.946020 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.989251 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.989706 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.989823 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf\") pod 
\"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.989919 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.989971 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990003 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990068 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hw7g\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990101 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990124 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990156 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:52 crc kubenswrapper[5004]: I1203 14:30:52.990220 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\" (UID: \"ffbbacf9-4c9b-47ac-9ff7-76bee9534490\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:52.997540 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:52.998243 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:52.999321 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.002013 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.008747 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g" (OuterVolumeSpecName: "kube-api-access-4hw7g") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "kube-api-access-4hw7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.008950 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info" (OuterVolumeSpecName: "pod-info") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.009893 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.044137 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.049364 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data" (OuterVolumeSpecName: "config-data") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.077652 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091565 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091633 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091651 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091736 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091755 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jw4x\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091786 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091810 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091843 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091889 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091909 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.091935 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie\") pod \"d8803d14-0481-4d2f-8fc3-46404a7411a7\" (UID: \"d8803d14-0481-4d2f-8fc3-46404a7411a7\") " Dec 03 14:30:53 crc 
kubenswrapper[5004]: I1203 14:30:53.092259 5004 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092274 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092287 5004 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092298 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hw7g\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-kube-api-access-4hw7g\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092307 5004 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092315 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092332 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092341 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092351 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.092701 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.094713 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.095980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.098833 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x" (OuterVolumeSpecName: "kube-api-access-7jw4x") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "kube-api-access-7jw4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.102433 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.119042 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.127996 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.129880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.140090 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf" (OuterVolumeSpecName: "server-conf") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.183688 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.188056 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data" (OuterVolumeSpecName: "config-data") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.193366 5004 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.193517 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.193605 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.193723 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jw4x\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-kube-api-access-7jw4x\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.193888 5004 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8803d14-0481-4d2f-8fc3-46404a7411a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194039 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194158 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194266 5004 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194341 5004 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8803d14-0481-4d2f-8fc3-46404a7411a7-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194402 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.194472 5004 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.200601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.219682 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ffbbacf9-4c9b-47ac-9ff7-76bee9534490" (UID: "ffbbacf9-4c9b-47ac-9ff7-76bee9534490"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.246985 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.296646 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.297055 5004 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8803d14-0481-4d2f-8fc3-46404a7411a7-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.297068 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ffbbacf9-4c9b-47ac-9ff7-76bee9534490-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.319505 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d8803d14-0481-4d2f-8fc3-46404a7411a7" (UID: "d8803d14-0481-4d2f-8fc3-46404a7411a7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.400042 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8803d14-0481-4d2f-8fc3-46404a7411a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.503240 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ffbbacf9-4c9b-47ac-9ff7-76bee9534490","Type":"ContainerDied","Data":"b7a6b3cf9067dbcadb7a7672a3086ebac3fcab41c0a098d2f31a2af239b71e29"} Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.503291 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.503301 5004 scope.go:117] "RemoveContainer" containerID="ce016eb24268eeb5262d178cd7816a4e0120c4f2932fcdc39308002822a6a623" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.506051 5004 generic.go:334] "Generic (PLEG): container finished" podID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerID="6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba" exitCode=0 Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.506111 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.506139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerDied","Data":"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba"} Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.506182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8803d14-0481-4d2f-8fc3-46404a7411a7","Type":"ContainerDied","Data":"0acf4a4af55697106abbdb02a68ba69aa50b535a573ae1d6bf621b1bd20e69a7"} Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.509432 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d" exitCode=0 Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.509478 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d"} Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.509508 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb"} Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.613057 5004 scope.go:117] "RemoveContainer" containerID="885deb67571b0f40c6fdcdd93d6440c32639996ce8f2cef0da52a94aa94e93d5" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.647665 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 
14:30:53.665076 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.668507 5004 scope.go:117] "RemoveContainer" containerID="6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.683958 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.711418 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.724441 5004 scope.go:117] "RemoveContainer" containerID="36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.736884 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.737311 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="setup-container" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737325 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="setup-container" Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.737335 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737341 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.737354 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="setup-container" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737360 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="setup-container" Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.737374 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737380 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737552 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.737563 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" containerName="rabbitmq" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.738511 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.741961 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.742695 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.742897 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.743197 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.744014 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.744153 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g9m58" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.744474 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.750184 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.765193 5004 scope.go:117] "RemoveContainer" containerID="6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba" Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.765698 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba\": container with ID starting with 6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba not found: ID does not exist" 
containerID="6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.765722 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba"} err="failed to get container status \"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba\": rpc error: code = NotFound desc = could not find container \"6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba\": container with ID starting with 6e6e318869690fea36f0147e4217af9c354e026c82a816412ec458de4daa65ba not found: ID does not exist" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.765740 5004 scope.go:117] "RemoveContainer" containerID="36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9" Dec 03 14:30:53 crc kubenswrapper[5004]: E1203 14:30:53.766113 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9\": container with ID starting with 36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9 not found: ID does not exist" containerID="36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.766128 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9"} err="failed to get container status \"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9\": rpc error: code = NotFound desc = could not find container \"36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9\": container with ID starting with 36d8d694d0ee297ea054b2d7bd796c6fef58a90e7fc8ec79a17830b682cbaab9 not found: ID does not exist" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.766139 5004 scope.go:117] 
"RemoveContainer" containerID="ae3e8ce119fe4c96e9d317ac8a1ed2026db3a3883a53e4163106629c2c17bf9a" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.769771 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.775043 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.778116 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.782709 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.782969 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.787900 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.788126 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.789777 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.806675 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.810052 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jt4gj" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.914842 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfmq\" 
(UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-kube-api-access-zvfmq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.914991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10a8bdc-f17c-4090-8c82-dcce9b638577-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915225 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915275 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8febd608-4e34-4b42-bcf7-27dbf88b7a09-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915310 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915370 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915472 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915532 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10a8bdc-f17c-4090-8c82-dcce9b638577-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915565 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915595 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhq5d\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-kube-api-access-vhq5d\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915612 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915655 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915693 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915754 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915810 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915836 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8febd608-4e34-4b42-bcf7-27dbf88b7a09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915885 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915926 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" 
(UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:53 crc kubenswrapper[5004]: I1203 14:30:53.915958 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.017978 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfmq\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-kube-api-access-zvfmq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10a8bdc-f17c-4090-8c82-dcce9b638577-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018134 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 
14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018180 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8febd608-4e34-4b42-bcf7-27dbf88b7a09-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018219 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018294 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018344 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10a8bdc-f17c-4090-8c82-dcce9b638577-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018361 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018376 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018441 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhq5d\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-kube-api-access-vhq5d\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018472 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018507 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018531 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018575 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018599 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8febd608-4e34-4b42-bcf7-27dbf88b7a09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018620 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018659 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.018764 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.019093 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.019234 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.019299 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.019932 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.020674 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.022010 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.023301 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.023474 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.023973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.025876 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8febd608-4e34-4b42-bcf7-27dbf88b7a09-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.026416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10a8bdc-f17c-4090-8c82-dcce9b638577-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.026407 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8febd608-4e34-4b42-bcf7-27dbf88b7a09-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.026479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.026615 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10a8bdc-f17c-4090-8c82-dcce9b638577-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.034725 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.034885 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.035310 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.036827 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10a8bdc-f17c-4090-8c82-dcce9b638577-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.040215 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8febd608-4e34-4b42-bcf7-27dbf88b7a09-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.040686 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvfmq\" (UniqueName: \"kubernetes.io/projected/8febd608-4e34-4b42-bcf7-27dbf88b7a09-kube-api-access-zvfmq\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.044347 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhq5d\" (UniqueName: \"kubernetes.io/projected/c10a8bdc-f17c-4090-8c82-dcce9b638577-kube-api-access-vhq5d\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.060526 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c10a8bdc-f17c-4090-8c82-dcce9b638577\") " pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.068600 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8febd608-4e34-4b42-bcf7-27dbf88b7a09\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.136504 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.137369 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.570458 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 14:30:54 crc kubenswrapper[5004]: W1203 14:30:54.575616 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8febd608_4e34_4b42_bcf7_27dbf88b7a09.slice/crio-5af0409ac1e8ae5c8b65b32ab7d35545a610023125eaa5a4363d10c657617264 WatchSource:0}: Error finding container 5af0409ac1e8ae5c8b65b32ab7d35545a610023125eaa5a4363d10c657617264: Status 404 returned error can't find the container with id 5af0409ac1e8ae5c8b65b32ab7d35545a610023125eaa5a4363d10c657617264 Dec 03 14:30:54 crc kubenswrapper[5004]: W1203 14:30:54.644080 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10a8bdc_f17c_4090_8c82_dcce9b638577.slice/crio-87613456b024f8f164f8cfea183b5629fe052054302be96842e10e95fdab5a6a WatchSource:0}: Error finding container 87613456b024f8f164f8cfea183b5629fe052054302be96842e10e95fdab5a6a: Status 404 returned error can't find the container with id 87613456b024f8f164f8cfea183b5629fe052054302be96842e10e95fdab5a6a Dec 03 14:30:54 crc kubenswrapper[5004]: I1203 14:30:54.651958 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 14:30:55 crc kubenswrapper[5004]: I1203 14:30:55.554194 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10a8bdc-f17c-4090-8c82-dcce9b638577","Type":"ContainerStarted","Data":"87613456b024f8f164f8cfea183b5629fe052054302be96842e10e95fdab5a6a"} Dec 03 14:30:55 crc kubenswrapper[5004]: I1203 14:30:55.556315 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"8febd608-4e34-4b42-bcf7-27dbf88b7a09","Type":"ContainerStarted","Data":"5af0409ac1e8ae5c8b65b32ab7d35545a610023125eaa5a4363d10c657617264"} Dec 03 14:30:55 crc kubenswrapper[5004]: I1203 14:30:55.625397 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8803d14-0481-4d2f-8fc3-46404a7411a7" path="/var/lib/kubelet/pods/d8803d14-0481-4d2f-8fc3-46404a7411a7/volumes" Dec 03 14:30:55 crc kubenswrapper[5004]: I1203 14:30:55.626938 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbbacf9-4c9b-47ac-9ff7-76bee9534490" path="/var/lib/kubelet/pods/ffbbacf9-4c9b-47ac-9ff7-76bee9534490/volumes" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.247548 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.254738 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.258318 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.265099 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.386749 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.386829 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v22rp\" (UniqueName: 
\"kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.386911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.386948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.386982 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.387022 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.387049 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488466 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488530 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488645 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488696 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v22rp\" (UniqueName: \"kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488780 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.488813 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.489421 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.489827 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.491033 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.491763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.491899 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.492028 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.508896 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v22rp\" (UniqueName: \"kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp\") pod \"dnsmasq-dns-79bd4cc8c9-vjvrf\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.565985 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c10a8bdc-f17c-4090-8c82-dcce9b638577","Type":"ContainerStarted","Data":"158a9dbe7167a92131db42852f205f296ea7de1f8dc9d076eb8df9c9ad249685"} Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.567901 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8febd608-4e34-4b42-bcf7-27dbf88b7a09","Type":"ContainerStarted","Data":"78d2f4dad8fcb385c45c034b856fefd4005052b6b5db6fccdaef583c5e32267b"} Dec 03 14:30:56 crc kubenswrapper[5004]: I1203 14:30:56.576706 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:57 crc kubenswrapper[5004]: I1203 14:30:57.100851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:30:57 crc kubenswrapper[5004]: W1203 14:30:57.110004 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc197f4a_f6bb_4b16_a471_ccb2abb42e72.slice/crio-cd9132b56ee1ac8f93f1c54cb97c4dae3826f62b983913fc98f367c377ca070b WatchSource:0}: Error finding container cd9132b56ee1ac8f93f1c54cb97c4dae3826f62b983913fc98f367c377ca070b: Status 404 returned error can't find the container with id cd9132b56ee1ac8f93f1c54cb97c4dae3826f62b983913fc98f367c377ca070b Dec 03 14:30:57 crc kubenswrapper[5004]: I1203 14:30:57.598299 5004 generic.go:334] "Generic (PLEG): container finished" podID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerID="891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76" exitCode=0 Dec 03 14:30:57 crc kubenswrapper[5004]: I1203 14:30:57.598674 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" event={"ID":"bc197f4a-f6bb-4b16-a471-ccb2abb42e72","Type":"ContainerDied","Data":"891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76"} Dec 03 14:30:57 crc kubenswrapper[5004]: I1203 14:30:57.599146 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" event={"ID":"bc197f4a-f6bb-4b16-a471-ccb2abb42e72","Type":"ContainerStarted","Data":"cd9132b56ee1ac8f93f1c54cb97c4dae3826f62b983913fc98f367c377ca070b"} Dec 03 14:30:58 crc kubenswrapper[5004]: I1203 14:30:58.611163 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" event={"ID":"bc197f4a-f6bb-4b16-a471-ccb2abb42e72","Type":"ContainerStarted","Data":"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541"} Dec 03 14:30:58 crc kubenswrapper[5004]: I1203 14:30:58.611526 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:30:58 crc kubenswrapper[5004]: I1203 14:30:58.635742 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" podStartSLOduration=2.6357167329999998 podStartE2EDuration="2.635716733s" podCreationTimestamp="2025-12-03 14:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:30:58.629198526 +0000 UTC m=+1471.378168762" watchObservedRunningTime="2025-12-03 14:30:58.635716733 +0000 UTC m=+1471.384687059" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.578697 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.675108 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.675352 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="dnsmasq-dns" 
containerID="cri-o://1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c" gracePeriod=10 Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.879015 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-78lk4"] Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.881681 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.900957 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-78lk4"] Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982492 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-svc\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982549 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982580 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982698 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-config\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982765 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982820 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:06 crc kubenswrapper[5004]: I1203 14:31:06.982848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk52c\" (UniqueName: \"kubernetes.io/projected/cccca89a-106f-4827-b398-81f1459b6648-kube-api-access-gk52c\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084618 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-config\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084783 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084822 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084842 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk52c\" (UniqueName: \"kubernetes.io/projected/cccca89a-106f-4827-b398-81f1459b6648-kube-api-access-gk52c\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084909 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-svc\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.084988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.086635 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.086696 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.086754 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.087388 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-config\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.087484 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.087605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cccca89a-106f-4827-b398-81f1459b6648-dns-svc\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.108601 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk52c\" (UniqueName: \"kubernetes.io/projected/cccca89a-106f-4827-b398-81f1459b6648-kube-api-access-gk52c\") pod \"dnsmasq-dns-55478c4467-78lk4\" (UID: \"cccca89a-106f-4827-b398-81f1459b6648\") " pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.196773 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.202438 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289467 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drlbw\" (UniqueName: \"kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289544 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289635 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289659 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289678 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.289744 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc\") pod \"d34e472d-b443-4e4f-9843-694db62e3394\" (UID: \"d34e472d-b443-4e4f-9843-694db62e3394\") " Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.294784 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw" (OuterVolumeSpecName: "kube-api-access-drlbw") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "kube-api-access-drlbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.355217 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.361422 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.367396 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.371261 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config" (OuterVolumeSpecName: "config") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.377503 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d34e472d-b443-4e4f-9843-694db62e3394" (UID: "d34e472d-b443-4e4f-9843-694db62e3394"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.397121 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.397157 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drlbw\" (UniqueName: \"kubernetes.io/projected/d34e472d-b443-4e4f-9843-694db62e3394-kube-api-access-drlbw\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.397172 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.397185 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc 
kubenswrapper[5004]: I1203 14:31:07.397199 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.397210 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d34e472d-b443-4e4f-9843-694db62e3394-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.674914 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-78lk4"] Dec 03 14:31:07 crc kubenswrapper[5004]: W1203 14:31:07.678804 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcccca89a_106f_4827_b398_81f1459b6648.slice/crio-cf7a0a701b24d9ff3f32b9e902a233d6bf80e1ff33c3fcf6920bd6488b04125f WatchSource:0}: Error finding container cf7a0a701b24d9ff3f32b9e902a233d6bf80e1ff33c3fcf6920bd6488b04125f: Status 404 returned error can't find the container with id cf7a0a701b24d9ff3f32b9e902a233d6bf80e1ff33c3fcf6920bd6488b04125f Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.710061 5004 generic.go:334] "Generic (PLEG): container finished" podID="d34e472d-b443-4e4f-9843-694db62e3394" containerID="1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c" exitCode=0 Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.710127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" event={"ID":"d34e472d-b443-4e4f-9843-694db62e3394","Type":"ContainerDied","Data":"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c"} Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.710156 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.710432 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qkvj5" event={"ID":"d34e472d-b443-4e4f-9843-694db62e3394","Type":"ContainerDied","Data":"23f1cbd8e2b40b809655fcc25655a917d21454b937fb4215b1e43ebc974cf9b6"} Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.710546 5004 scope.go:117] "RemoveContainer" containerID="1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.713301 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-78lk4" event={"ID":"cccca89a-106f-4827-b398-81f1459b6648","Type":"ContainerStarted","Data":"cf7a0a701b24d9ff3f32b9e902a233d6bf80e1ff33c3fcf6920bd6488b04125f"} Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.738566 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.738897 5004 scope.go:117] "RemoveContainer" containerID="e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.753849 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qkvj5"] Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.771684 5004 scope.go:117] "RemoveContainer" containerID="1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c" Dec 03 14:31:07 crc kubenswrapper[5004]: E1203 14:31:07.772173 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c\": container with ID starting with 1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c not found: ID does not exist" 
containerID="1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.772215 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c"} err="failed to get container status \"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c\": rpc error: code = NotFound desc = could not find container \"1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c\": container with ID starting with 1dafdb58fe72a14c52be7b9cefc95f988f5eb4a5d8225595d615f5e4dfd0f71c not found: ID does not exist" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.772242 5004 scope.go:117] "RemoveContainer" containerID="e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53" Dec 03 14:31:07 crc kubenswrapper[5004]: E1203 14:31:07.772488 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53\": container with ID starting with e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53 not found: ID does not exist" containerID="e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.772517 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53"} err="failed to get container status \"e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53\": rpc error: code = NotFound desc = could not find container \"e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53\": container with ID starting with e27fd78f0f20296fc5134c29f080d12eff7ad5a2ed40d28fa303b82959dc8f53 not found: ID does not exist" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.887987 5004 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:07 crc kubenswrapper[5004]: E1203 14:31:07.888365 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="init" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.888379 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="init" Dec 03 14:31:07 crc kubenswrapper[5004]: E1203 14:31:07.888403 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="dnsmasq-dns" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.888410 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="dnsmasq-dns" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.888579 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34e472d-b443-4e4f-9843-694db62e3394" containerName="dnsmasq-dns" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.889917 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:07 crc kubenswrapper[5004]: I1203 14:31:07.901596 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.010433 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxq69\" (UniqueName: \"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.010744 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.010971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.113123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.113266 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.113330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxq69\" (UniqueName: \"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.114261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.114295 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.131890 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxq69\" (UniqueName: \"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69\") pod \"redhat-marketplace-q9457\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.235311 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.724300 5004 generic.go:334] "Generic (PLEG): container finished" podID="cccca89a-106f-4827-b398-81f1459b6648" containerID="3a729f1bda456375598d9586f7bde5bb97cd54696029840ff539d1e2b8a02478" exitCode=0 Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.724341 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-78lk4" event={"ID":"cccca89a-106f-4827-b398-81f1459b6648","Type":"ContainerDied","Data":"3a729f1bda456375598d9586f7bde5bb97cd54696029840ff539d1e2b8a02478"} Dec 03 14:31:08 crc kubenswrapper[5004]: I1203 14:31:08.812693 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.632028 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34e472d-b443-4e4f-9843-694db62e3394" path="/var/lib/kubelet/pods/d34e472d-b443-4e4f-9843-694db62e3394/volumes" Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.738489 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-78lk4" event={"ID":"cccca89a-106f-4827-b398-81f1459b6648","Type":"ContainerStarted","Data":"27d18421a68a50d28fbb8688bf8fe75f60f62728e62e11d11b596feaa3d094ed"} Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.738553 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.741266 5004 generic.go:334] "Generic (PLEG): container finished" podID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerID="37a275a57cacffa79a42bbf6488a8c139ac431b95c656a6ffc31a9bde8c5559b" exitCode=0 Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.741376 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" 
event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerDied","Data":"37a275a57cacffa79a42bbf6488a8c139ac431b95c656a6ffc31a9bde8c5559b"} Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.741512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerStarted","Data":"441702e6353255499fd6216c497705584efb9b8285af351fb954959233458a14"} Dec 03 14:31:09 crc kubenswrapper[5004]: I1203 14:31:09.762288 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-78lk4" podStartSLOduration=3.762236259 podStartE2EDuration="3.762236259s" podCreationTimestamp="2025-12-03 14:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:31:09.755698101 +0000 UTC m=+1482.504668357" watchObservedRunningTime="2025-12-03 14:31:09.762236259 +0000 UTC m=+1482.511206495" Dec 03 14:31:11 crc kubenswrapper[5004]: I1203 14:31:11.761461 5004 generic.go:334] "Generic (PLEG): container finished" podID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerID="e1d60141327563839520aea5f6058ba1f18c976259a57d6416d91e5ef2a4331a" exitCode=0 Dec 03 14:31:11 crc kubenswrapper[5004]: I1203 14:31:11.761560 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerDied","Data":"e1d60141327563839520aea5f6058ba1f18c976259a57d6416d91e5ef2a4331a"} Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.662062 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.664812 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.682446 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.773133 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerStarted","Data":"54fb80c81ba96ea25a3d66760bbd21c3437ed9dfdf6b394816a0cb74ddba2399"} Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.791749 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q9457" podStartSLOduration=3.390111587 podStartE2EDuration="5.791731637s" podCreationTimestamp="2025-12-03 14:31:07 +0000 UTC" firstStartedPulling="2025-12-03 14:31:09.743954874 +0000 UTC m=+1482.492925120" lastFinishedPulling="2025-12-03 14:31:12.145574934 +0000 UTC m=+1484.894545170" observedRunningTime="2025-12-03 14:31:12.789587495 +0000 UTC m=+1485.538557731" watchObservedRunningTime="2025-12-03 14:31:12.791731637 +0000 UTC m=+1485.540701883" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.804310 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfgnn\" (UniqueName: \"kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.804443 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " 
pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.804544 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.906260 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.906354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfgnn\" (UniqueName: \"kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.906430 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.906962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " 
pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.906981 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:12 crc kubenswrapper[5004]: I1203 14:31:12.929772 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfgnn\" (UniqueName: \"kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn\") pod \"certified-operators-kdx29\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:13 crc kubenswrapper[5004]: I1203 14:31:13.005597 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:14 crc kubenswrapper[5004]: I1203 14:31:14.271656 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:14 crc kubenswrapper[5004]: W1203 14:31:14.276663 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44391f91_0b9c_4154_aa3a_1166317c947d.slice/crio-0f37d146110ac280e20e02e76ef38a5680fe367d747aead570b1744bbb1b7df6 WatchSource:0}: Error finding container 0f37d146110ac280e20e02e76ef38a5680fe367d747aead570b1744bbb1b7df6: Status 404 returned error can't find the container with id 0f37d146110ac280e20e02e76ef38a5680fe367d747aead570b1744bbb1b7df6 Dec 03 14:31:14 crc kubenswrapper[5004]: I1203 14:31:14.797303 5004 generic.go:334] "Generic (PLEG): container finished" podID="44391f91-0b9c-4154-aa3a-1166317c947d" containerID="2c792883674dc5d8e6992bf548cbb02c4cf6964a990c5ca9893e717551342cad" 
exitCode=0 Dec 03 14:31:14 crc kubenswrapper[5004]: I1203 14:31:14.797735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerDied","Data":"2c792883674dc5d8e6992bf548cbb02c4cf6964a990c5ca9893e717551342cad"} Dec 03 14:31:14 crc kubenswrapper[5004]: I1203 14:31:14.797815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerStarted","Data":"0f37d146110ac280e20e02e76ef38a5680fe367d747aead570b1744bbb1b7df6"} Dec 03 14:31:15 crc kubenswrapper[5004]: I1203 14:31:15.810800 5004 generic.go:334] "Generic (PLEG): container finished" podID="44391f91-0b9c-4154-aa3a-1166317c947d" containerID="f7fd9f1ca9b867a02e5f3f27e292ba571510bb9bc1e8862b3d8f48a243d2e11d" exitCode=0 Dec 03 14:31:15 crc kubenswrapper[5004]: I1203 14:31:15.811052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerDied","Data":"f7fd9f1ca9b867a02e5f3f27e292ba571510bb9bc1e8862b3d8f48a243d2e11d"} Dec 03 14:31:16 crc kubenswrapper[5004]: I1203 14:31:16.824711 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerStarted","Data":"a93c82eb096fc025353c4c2d1a9b59cd049793d395d61ccf9e3da197a7a1928c"} Dec 03 14:31:16 crc kubenswrapper[5004]: I1203 14:31:16.844278 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdx29" podStartSLOduration=3.381968366 podStartE2EDuration="4.844260307s" podCreationTimestamp="2025-12-03 14:31:12 +0000 UTC" firstStartedPulling="2025-12-03 14:31:14.799708266 +0000 UTC m=+1487.548678502" lastFinishedPulling="2025-12-03 14:31:16.262000207 
+0000 UTC m=+1489.010970443" observedRunningTime="2025-12-03 14:31:16.84368351 +0000 UTC m=+1489.592653746" watchObservedRunningTime="2025-12-03 14:31:16.844260307 +0000 UTC m=+1489.593230543" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.204018 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-78lk4" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.266136 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.266370 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="dnsmasq-dns" containerID="cri-o://26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541" gracePeriod=10 Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.771432 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.835745 5004 generic.go:334] "Generic (PLEG): container finished" podID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerID="26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541" exitCode=0 Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.836083 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" event={"ID":"bc197f4a-f6bb-4b16-a471-ccb2abb42e72","Type":"ContainerDied","Data":"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541"} Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.836141 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" event={"ID":"bc197f4a-f6bb-4b16-a471-ccb2abb42e72","Type":"ContainerDied","Data":"cd9132b56ee1ac8f93f1c54cb97c4dae3826f62b983913fc98f367c377ca070b"} Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.836144 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vjvrf" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.836164 5004 scope.go:117] "RemoveContainer" containerID="26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.860090 5004 scope.go:117] "RemoveContainer" containerID="891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.894360 5004 scope.go:117] "RemoveContainer" containerID="26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541" Dec 03 14:31:17 crc kubenswrapper[5004]: E1203 14:31:17.895056 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541\": container with ID starting with 26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541 not found: ID does not exist" containerID="26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.895087 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541"} err="failed to get container status \"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541\": rpc error: code = NotFound desc = could not find container \"26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541\": container with ID starting with 26e15ded4ea4a57818b7c6d79fdd211955383471bf2507322845bb327fa3e541 not found: ID does not exist" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.895108 5004 scope.go:117] "RemoveContainer" containerID="891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76" Dec 03 14:31:17 crc kubenswrapper[5004]: E1203 14:31:17.895385 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76\": container with ID starting with 891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76 not found: ID does not exist" containerID="891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.895434 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76"} err="failed to get container status \"891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76\": rpc error: code = NotFound desc = could not find container \"891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76\": container with ID starting with 891f56434be06946d6e20db2f20e00ce40d78551f87295d46af94b4b578e1c76 not found: ID does not exist" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911153 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911217 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911417 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 
crc kubenswrapper[5004]: I1203 14:31:17.911450 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911559 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911642 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v22rp\" (UniqueName: \"kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.911676 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config\") pod \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\" (UID: \"bc197f4a-f6bb-4b16-a471-ccb2abb42e72\") " Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.919850 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp" (OuterVolumeSpecName: "kube-api-access-v22rp") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "kube-api-access-v22rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.976289 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.978912 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.986544 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.987930 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config" (OuterVolumeSpecName: "config") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:17 crc kubenswrapper[5004]: I1203 14:31:17.988349 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.004881 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc197f4a-f6bb-4b16-a471-ccb2abb42e72" (UID: "bc197f4a-f6bb-4b16-a471-ccb2abb42e72"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.013935 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.013987 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.014001 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.014017 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-swift-storage-0\") 
on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.014031 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.014047 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v22rp\" (UniqueName: \"kubernetes.io/projected/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-kube-api-access-v22rp\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.014061 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc197f4a-f6bb-4b16-a471-ccb2abb42e72-config\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.180152 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.190552 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vjvrf"] Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.236041 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.236088 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.285927 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:18 crc kubenswrapper[5004]: I1203 14:31:18.895716 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:19 crc kubenswrapper[5004]: I1203 14:31:19.626435 5004 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" path="/var/lib/kubelet/pods/bc197f4a-f6bb-4b16-a471-ccb2abb42e72/volumes" Dec 03 14:31:20 crc kubenswrapper[5004]: I1203 14:31:20.052062 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:21 crc kubenswrapper[5004]: I1203 14:31:21.873087 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q9457" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="registry-server" containerID="cri-o://54fb80c81ba96ea25a3d66760bbd21c3437ed9dfdf6b394816a0cb74ddba2399" gracePeriod=2 Dec 03 14:31:22 crc kubenswrapper[5004]: I1203 14:31:22.887664 5004 generic.go:334] "Generic (PLEG): container finished" podID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerID="54fb80c81ba96ea25a3d66760bbd21c3437ed9dfdf6b394816a0cb74ddba2399" exitCode=0 Dec 03 14:31:22 crc kubenswrapper[5004]: I1203 14:31:22.887731 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerDied","Data":"54fb80c81ba96ea25a3d66760bbd21c3437ed9dfdf6b394816a0cb74ddba2399"} Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.006433 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.006494 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.063753 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.648285 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.724736 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities\") pod \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.725452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities" (OuterVolumeSpecName: "utilities") pod "87e7d38c-eba3-4ab4-be2b-56a79b837b82" (UID: "87e7d38c-eba3-4ab4-be2b-56a79b837b82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.725578 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content\") pod \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.729120 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxq69\" (UniqueName: \"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69\") pod \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\" (UID: \"87e7d38c-eba3-4ab4-be2b-56a79b837b82\") " Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.730049 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.735971 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69" (OuterVolumeSpecName: "kube-api-access-rxq69") pod "87e7d38c-eba3-4ab4-be2b-56a79b837b82" (UID: "87e7d38c-eba3-4ab4-be2b-56a79b837b82"). InnerVolumeSpecName "kube-api-access-rxq69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.745879 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87e7d38c-eba3-4ab4-be2b-56a79b837b82" (UID: "87e7d38c-eba3-4ab4-be2b-56a79b837b82"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.832171 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e7d38c-eba3-4ab4-be2b-56a79b837b82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.832207 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxq69\" (UniqueName: \"kubernetes.io/projected/87e7d38c-eba3-4ab4-be2b-56a79b837b82-kube-api-access-rxq69\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.898727 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9457" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.898723 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9457" event={"ID":"87e7d38c-eba3-4ab4-be2b-56a79b837b82","Type":"ContainerDied","Data":"441702e6353255499fd6216c497705584efb9b8285af351fb954959233458a14"} Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.898804 5004 scope.go:117] "RemoveContainer" containerID="54fb80c81ba96ea25a3d66760bbd21c3437ed9dfdf6b394816a0cb74ddba2399" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.935697 5004 scope.go:117] "RemoveContainer" containerID="e1d60141327563839520aea5f6058ba1f18c976259a57d6416d91e5ef2a4331a" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.937991 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.946405 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9457"] Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.957550 5004 scope.go:117] "RemoveContainer" containerID="37a275a57cacffa79a42bbf6488a8c139ac431b95c656a6ffc31a9bde8c5559b" Dec 03 14:31:23 crc kubenswrapper[5004]: I1203 14:31:23.983317 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:25 crc kubenswrapper[5004]: I1203 14:31:25.625568 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" path="/var/lib/kubelet/pods/87e7d38c-eba3-4ab4-be2b-56a79b837b82/volumes" Dec 03 14:31:26 crc kubenswrapper[5004]: I1203 14:31:26.250622 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:26 crc kubenswrapper[5004]: I1203 14:31:26.251438 5004 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-kdx29" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="registry-server" containerID="cri-o://a93c82eb096fc025353c4c2d1a9b59cd049793d395d61ccf9e3da197a7a1928c" gracePeriod=2 Dec 03 14:31:27 crc kubenswrapper[5004]: I1203 14:31:27.946552 5004 generic.go:334] "Generic (PLEG): container finished" podID="44391f91-0b9c-4154-aa3a-1166317c947d" containerID="a93c82eb096fc025353c4c2d1a9b59cd049793d395d61ccf9e3da197a7a1928c" exitCode=0 Dec 03 14:31:27 crc kubenswrapper[5004]: I1203 14:31:27.947013 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerDied","Data":"a93c82eb096fc025353c4c2d1a9b59cd049793d395d61ccf9e3da197a7a1928c"} Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.223549 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.326280 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfgnn\" (UniqueName: \"kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn\") pod \"44391f91-0b9c-4154-aa3a-1166317c947d\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.326633 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content\") pod \"44391f91-0b9c-4154-aa3a-1166317c947d\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.326705 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities\") pod \"44391f91-0b9c-4154-aa3a-1166317c947d\" (UID: \"44391f91-0b9c-4154-aa3a-1166317c947d\") " Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.327454 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities" (OuterVolumeSpecName: "utilities") pod "44391f91-0b9c-4154-aa3a-1166317c947d" (UID: "44391f91-0b9c-4154-aa3a-1166317c947d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.328347 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.341318 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn" (OuterVolumeSpecName: "kube-api-access-qfgnn") pod "44391f91-0b9c-4154-aa3a-1166317c947d" (UID: "44391f91-0b9c-4154-aa3a-1166317c947d"). InnerVolumeSpecName "kube-api-access-qfgnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.383503 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44391f91-0b9c-4154-aa3a-1166317c947d" (UID: "44391f91-0b9c-4154-aa3a-1166317c947d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.430540 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfgnn\" (UniqueName: \"kubernetes.io/projected/44391f91-0b9c-4154-aa3a-1166317c947d-kube-api-access-qfgnn\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.430577 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44391f91-0b9c-4154-aa3a-1166317c947d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.957971 5004 generic.go:334] "Generic (PLEG): container finished" podID="8febd608-4e34-4b42-bcf7-27dbf88b7a09" containerID="78d2f4dad8fcb385c45c034b856fefd4005052b6b5db6fccdaef583c5e32267b" exitCode=0 Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.958060 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8febd608-4e34-4b42-bcf7-27dbf88b7a09","Type":"ContainerDied","Data":"78d2f4dad8fcb385c45c034b856fefd4005052b6b5db6fccdaef583c5e32267b"} Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.960562 5004 generic.go:334] "Generic (PLEG): container finished" podID="c10a8bdc-f17c-4090-8c82-dcce9b638577" containerID="158a9dbe7167a92131db42852f205f296ea7de1f8dc9d076eb8df9c9ad249685" exitCode=0 Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.960637 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10a8bdc-f17c-4090-8c82-dcce9b638577","Type":"ContainerDied","Data":"158a9dbe7167a92131db42852f205f296ea7de1f8dc9d076eb8df9c9ad249685"} Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.969160 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdx29" 
event={"ID":"44391f91-0b9c-4154-aa3a-1166317c947d","Type":"ContainerDied","Data":"0f37d146110ac280e20e02e76ef38a5680fe367d747aead570b1744bbb1b7df6"} Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.969223 5004 scope.go:117] "RemoveContainer" containerID="a93c82eb096fc025353c4c2d1a9b59cd049793d395d61ccf9e3da197a7a1928c" Dec 03 14:31:28 crc kubenswrapper[5004]: I1203 14:31:28.969405 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdx29" Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.174114 5004 scope.go:117] "RemoveContainer" containerID="f7fd9f1ca9b867a02e5f3f27e292ba571510bb9bc1e8862b3d8f48a243d2e11d" Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.178027 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.222119 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdx29"] Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.289237 5004 scope.go:117] "RemoveContainer" containerID="2c792883674dc5d8e6992bf548cbb02c4cf6964a990c5ca9893e717551342cad" Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.624333 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" path="/var/lib/kubelet/pods/44391f91-0b9c-4154-aa3a-1166317c947d/volumes" Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.980364 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10a8bdc-f17c-4090-8c82-dcce9b638577","Type":"ContainerStarted","Data":"8d08de54082ee80ee1ece0de0f953673907cbc61494ca3364a98dc8a6a891e53"} Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.980655 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 
14:31:29.986278 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8febd608-4e34-4b42-bcf7-27dbf88b7a09","Type":"ContainerStarted","Data":"5bc0247a747eb544a4459c94a3daf95fb52ca9341a81af8a23dc959f12feaa4e"} Dec 03 14:31:29 crc kubenswrapper[5004]: I1203 14:31:29.986512 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.013133 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.013117268 podStartE2EDuration="37.013117268s" podCreationTimestamp="2025-12-03 14:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:31:30.007642751 +0000 UTC m=+1502.756612987" watchObservedRunningTime="2025-12-03 14:31:30.013117268 +0000 UTC m=+1502.762087504" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.034646 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.034623535 podStartE2EDuration="37.034623535s" podCreationTimestamp="2025-12-03 14:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:31:30.030117215 +0000 UTC m=+1502.779087451" watchObservedRunningTime="2025-12-03 14:31:30.034623535 +0000 UTC m=+1502.783593771" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.186159 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh"] Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.186979 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="dnsmasq-dns" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 
14:31:30.187003 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="dnsmasq-dns" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187019 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="init" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187026 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="init" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187062 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="extract-utilities" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187073 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="extract-utilities" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187087 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="extract-content" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187094 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="extract-content" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187109 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187115 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187139 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="extract-content" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187147 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="extract-content" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187159 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="extract-utilities" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187166 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="extract-utilities" Dec 03 14:31:30 crc kubenswrapper[5004]: E1203 14:31:30.187181 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.187188 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.194339 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc197f4a-f6bb-4b16-a471-ccb2abb42e72" containerName="dnsmasq-dns" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.194399 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="44391f91-0b9c-4154-aa3a-1166317c947d" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.194419 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e7d38c-eba3-4ab4-be2b-56a79b837b82" containerName="registry-server" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.195603 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.198159 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.198638 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.199595 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.200139 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.200606 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh"] Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.296554 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5wsb\" (UniqueName: \"kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.296998 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: 
I1203 14:31:30.297179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.297218 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.398877 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.398959 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.398996 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.399065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5wsb\" (UniqueName: \"kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.405316 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.406753 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.410324 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.417701 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5wsb\" (UniqueName: \"kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:30 crc kubenswrapper[5004]: I1203 14:31:30.532363 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:31 crc kubenswrapper[5004]: W1203 14:31:31.153434 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfada131d_446d_4819_b137_48910402240f.slice/crio-ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066 WatchSource:0}: Error finding container ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066: Status 404 returned error can't find the container with id ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066 Dec 03 14:31:31 crc kubenswrapper[5004]: I1203 14:31:31.156249 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh"] Dec 03 14:31:32 crc kubenswrapper[5004]: I1203 14:31:32.010142 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" event={"ID":"fada131d-446d-4819-b137-48910402240f","Type":"ContainerStarted","Data":"ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066"} Dec 03 14:31:43 crc kubenswrapper[5004]: I1203 14:31:43.125848 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" event={"ID":"fada131d-446d-4819-b137-48910402240f","Type":"ContainerStarted","Data":"1094d2055e419c665044a30d666da0f23697e9d0f39874ca5802ddd3b6ef7200"} Dec 03 
14:31:43 crc kubenswrapper[5004]: I1203 14:31:43.154422 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" podStartSLOduration=2.328215626 podStartE2EDuration="13.154397868s" podCreationTimestamp="2025-12-03 14:31:30 +0000 UTC" firstStartedPulling="2025-12-03 14:31:31.156714117 +0000 UTC m=+1503.905684353" lastFinishedPulling="2025-12-03 14:31:41.982896359 +0000 UTC m=+1514.731866595" observedRunningTime="2025-12-03 14:31:43.145649077 +0000 UTC m=+1515.894619313" watchObservedRunningTime="2025-12-03 14:31:43.154397868 +0000 UTC m=+1515.903368114" Dec 03 14:31:44 crc kubenswrapper[5004]: I1203 14:31:44.141064 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 14:31:44 crc kubenswrapper[5004]: I1203 14:31:44.142228 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 14:31:55 crc kubenswrapper[5004]: I1203 14:31:55.244358 5004 generic.go:334] "Generic (PLEG): container finished" podID="fada131d-446d-4819-b137-48910402240f" containerID="1094d2055e419c665044a30d666da0f23697e9d0f39874ca5802ddd3b6ef7200" exitCode=0 Dec 03 14:31:55 crc kubenswrapper[5004]: I1203 14:31:55.244939 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" event={"ID":"fada131d-446d-4819-b137-48910402240f","Type":"ContainerDied","Data":"1094d2055e419c665044a30d666da0f23697e9d0f39874ca5802ddd3b6ef7200"} Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.763940 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.773427 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5wsb\" (UniqueName: \"kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb\") pod \"fada131d-446d-4819-b137-48910402240f\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.773575 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory\") pod \"fada131d-446d-4819-b137-48910402240f\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.773713 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key\") pod \"fada131d-446d-4819-b137-48910402240f\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.788887 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb" (OuterVolumeSpecName: "kube-api-access-w5wsb") pod "fada131d-446d-4819-b137-48910402240f" (UID: "fada131d-446d-4819-b137-48910402240f"). InnerVolumeSpecName "kube-api-access-w5wsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.811090 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory" (OuterVolumeSpecName: "inventory") pod "fada131d-446d-4819-b137-48910402240f" (UID: "fada131d-446d-4819-b137-48910402240f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.825328 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fada131d-446d-4819-b137-48910402240f" (UID: "fada131d-446d-4819-b137-48910402240f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.875181 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle\") pod \"fada131d-446d-4819-b137-48910402240f\" (UID: \"fada131d-446d-4819-b137-48910402240f\") " Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.875613 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.875637 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5wsb\" (UniqueName: \"kubernetes.io/projected/fada131d-446d-4819-b137-48910402240f-kube-api-access-w5wsb\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.875652 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.879422 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fada131d-446d-4819-b137-48910402240f" (UID: 
"fada131d-446d-4819-b137-48910402240f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:31:56 crc kubenswrapper[5004]: I1203 14:31:56.976786 5004 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada131d-446d-4819-b137-48910402240f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.267013 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" event={"ID":"fada131d-446d-4819-b137-48910402240f","Type":"ContainerDied","Data":"ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066"} Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.267065 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea9432a829d72784b7ff03f768b4ce05163be7ab5bdfeaa897a27426f4bd8066" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.267108 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.346176 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq"] Dec 03 14:31:57 crc kubenswrapper[5004]: E1203 14:31:57.348004 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fada131d-446d-4819-b137-48910402240f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.348029 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fada131d-446d-4819-b137-48910402240f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.348303 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fada131d-446d-4819-b137-48910402240f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.349184 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.351529 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.351757 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.351948 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.352111 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.356814 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq"] Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.385096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.385214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx6ql\" (UniqueName: \"kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.385242 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.487329 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.487438 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx6ql\" (UniqueName: \"kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.487462 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.491238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.491314 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.503182 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx6ql\" (UniqueName: \"kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4skhq\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:57 crc kubenswrapper[5004]: I1203 14:31:57.667392 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:31:58 crc kubenswrapper[5004]: W1203 14:31:58.226421 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c867273_ae64_48f8_85f1_4eb5624b9dea.slice/crio-bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992 WatchSource:0}: Error finding container bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992: Status 404 returned error can't find the container with id bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992 Dec 03 14:31:58 crc kubenswrapper[5004]: I1203 14:31:58.236086 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq"] Dec 03 14:31:58 crc kubenswrapper[5004]: I1203 14:31:58.276056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" event={"ID":"5c867273-ae64-48f8-85f1-4eb5624b9dea","Type":"ContainerStarted","Data":"bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992"} Dec 03 14:31:59 crc kubenswrapper[5004]: I1203 14:31:59.293839 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" event={"ID":"5c867273-ae64-48f8-85f1-4eb5624b9dea","Type":"ContainerStarted","Data":"c239186d0c0f35be6ae99905b9979ce1f1ecc85911dc8ea087ac1ee70180545a"} Dec 03 14:31:59 crc kubenswrapper[5004]: I1203 14:31:59.326988 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" podStartSLOduration=1.555774948 podStartE2EDuration="2.326969996s" podCreationTimestamp="2025-12-03 14:31:57 +0000 UTC" firstStartedPulling="2025-12-03 14:31:58.229874731 +0000 UTC m=+1530.978844967" lastFinishedPulling="2025-12-03 14:31:59.001069779 +0000 UTC m=+1531.750040015" observedRunningTime="2025-12-03 
14:31:59.322369504 +0000 UTC m=+1532.071339750" watchObservedRunningTime="2025-12-03 14:31:59.326969996 +0000 UTC m=+1532.075940232" Dec 03 14:32:02 crc kubenswrapper[5004]: I1203 14:32:02.320579 5004 generic.go:334] "Generic (PLEG): container finished" podID="5c867273-ae64-48f8-85f1-4eb5624b9dea" containerID="c239186d0c0f35be6ae99905b9979ce1f1ecc85911dc8ea087ac1ee70180545a" exitCode=0 Dec 03 14:32:02 crc kubenswrapper[5004]: I1203 14:32:02.320682 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" event={"ID":"5c867273-ae64-48f8-85f1-4eb5624b9dea","Type":"ContainerDied","Data":"c239186d0c0f35be6ae99905b9979ce1f1ecc85911dc8ea087ac1ee70180545a"} Dec 03 14:32:03 crc kubenswrapper[5004]: I1203 14:32:03.893990 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.002817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key\") pod \"5c867273-ae64-48f8-85f1-4eb5624b9dea\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.002883 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx6ql\" (UniqueName: \"kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql\") pod \"5c867273-ae64-48f8-85f1-4eb5624b9dea\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.003101 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory\") pod \"5c867273-ae64-48f8-85f1-4eb5624b9dea\" (UID: \"5c867273-ae64-48f8-85f1-4eb5624b9dea\") " Dec 03 14:32:04 crc 
kubenswrapper[5004]: I1203 14:32:04.022288 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql" (OuterVolumeSpecName: "kube-api-access-qx6ql") pod "5c867273-ae64-48f8-85f1-4eb5624b9dea" (UID: "5c867273-ae64-48f8-85f1-4eb5624b9dea"). InnerVolumeSpecName "kube-api-access-qx6ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.034719 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c867273-ae64-48f8-85f1-4eb5624b9dea" (UID: "5c867273-ae64-48f8-85f1-4eb5624b9dea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.036245 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory" (OuterVolumeSpecName: "inventory") pod "5c867273-ae64-48f8-85f1-4eb5624b9dea" (UID: "5c867273-ae64-48f8-85f1-4eb5624b9dea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.105785 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.106020 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c867273-ae64-48f8-85f1-4eb5624b9dea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.106111 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx6ql\" (UniqueName: \"kubernetes.io/projected/5c867273-ae64-48f8-85f1-4eb5624b9dea-kube-api-access-qx6ql\") on node \"crc\" DevicePath \"\"" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.341768 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" event={"ID":"5c867273-ae64-48f8-85f1-4eb5624b9dea","Type":"ContainerDied","Data":"bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992"} Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.341814 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc9718373130d40fdee46432a90d6395dd0e6d4f4e0c66820ade93f29e7e992" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.341894 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4skhq" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.428643 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4"] Dec 03 14:32:04 crc kubenswrapper[5004]: E1203 14:32:04.429428 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c867273-ae64-48f8-85f1-4eb5624b9dea" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.429456 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c867273-ae64-48f8-85f1-4eb5624b9dea" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.429956 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c867273-ae64-48f8-85f1-4eb5624b9dea" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.431016 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.436667 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.436674 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.437056 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.437193 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.453843 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4"] Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.613277 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.613561 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.613637 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjgf\" (UniqueName: \"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.613752 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.715719 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjgf\" (UniqueName: \"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.716095 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.716411 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.716594 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.721134 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.721336 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.721911 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 
14:32:04.750442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjgf\" (UniqueName: \"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:04 crc kubenswrapper[5004]: I1203 14:32:04.770735 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:32:05 crc kubenswrapper[5004]: I1203 14:32:05.291534 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4"] Dec 03 14:32:07 crc kubenswrapper[5004]: W1203 14:32:07.010184 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a8c5468_695c_4238_9cae_3b010f6987ae.slice/crio-f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06 WatchSource:0}: Error finding container f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06: Status 404 returned error can't find the container with id f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06 Dec 03 14:32:07 crc kubenswrapper[5004]: I1203 14:32:07.385312 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" event={"ID":"7a8c5468-695c-4238-9cae-3b010f6987ae","Type":"ContainerStarted","Data":"f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06"} Dec 03 14:32:09 crc kubenswrapper[5004]: I1203 14:32:09.405095 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" 
event={"ID":"7a8c5468-695c-4238-9cae-3b010f6987ae","Type":"ContainerStarted","Data":"d3db6a619e42190a32608e612403c4339e9d87ff13008edea0e65daa3ded6898"} Dec 03 14:32:09 crc kubenswrapper[5004]: I1203 14:32:09.423646 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" podStartSLOduration=4.141700028 podStartE2EDuration="5.423625995s" podCreationTimestamp="2025-12-03 14:32:04 +0000 UTC" firstStartedPulling="2025-12-03 14:32:07.01277218 +0000 UTC m=+1539.761742416" lastFinishedPulling="2025-12-03 14:32:08.294698147 +0000 UTC m=+1541.043668383" observedRunningTime="2025-12-03 14:32:09.419160237 +0000 UTC m=+1542.168130483" watchObservedRunningTime="2025-12-03 14:32:09.423625995 +0000 UTC m=+1542.172596231" Dec 03 14:32:34 crc kubenswrapper[5004]: I1203 14:32:34.727723 5004 scope.go:117] "RemoveContainer" containerID="1904921655fa97ba0791e953363ab31d01b4a4bb4ed587fb06b0780a8b86782b" Dec 03 14:32:34 crc kubenswrapper[5004]: I1203 14:32:34.756372 5004 scope.go:117] "RemoveContainer" containerID="bf1b06170aed0ce5b23ad342d4c2bc34877c01a1a68b1090d9237b6aaab8434b" Dec 03 14:32:34 crc kubenswrapper[5004]: I1203 14:32:34.801285 5004 scope.go:117] "RemoveContainer" containerID="67f6eb9d3b4b1f8845468378f3a8e0d565afba742da65172f26ef69921c09599" Dec 03 14:32:34 crc kubenswrapper[5004]: I1203 14:32:34.853626 5004 scope.go:117] "RemoveContainer" containerID="465d86b794a1263aacd13fbd7d927b3cd1c4a7437478104ba4ef80557d1b03b0" Dec 03 14:32:34 crc kubenswrapper[5004]: I1203 14:32:34.875941 5004 scope.go:117] "RemoveContainer" containerID="1fc0a9e367fdeabd5336b2b4f36abd921fec677a4c3548ee811b27f9b7633f1d" Dec 03 14:33:22 crc kubenswrapper[5004]: I1203 14:33:22.824238 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:33:22 crc kubenswrapper[5004]: I1203 14:33:22.824982 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.022462 5004 scope.go:117] "RemoveContainer" containerID="c0ae94b2f5a01a864e7882903696e9bda1debfed67efcbf3809c711c43daa91c" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.059034 5004 scope.go:117] "RemoveContainer" containerID="dd591f0c931b6816b155aec5b3c908f746848a520186c68192a606ea930fb2ac" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.082902 5004 scope.go:117] "RemoveContainer" containerID="5d97190441b12e18ba6b219a7b4fcf6e8f6e174ece11a627f7a20ce5b56993a0" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.101155 5004 scope.go:117] "RemoveContainer" containerID="9df7744ac7308982f5849dadf0c0067607f2893d9d53424d70637e024304f2cb" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.121515 5004 scope.go:117] "RemoveContainer" containerID="4d5d8db4142cd1542e6c685d26ecb00f2160b99c572036ea261da4c5a95253d9" Dec 03 14:33:35 crc kubenswrapper[5004]: I1203 14:33:35.143281 5004 scope.go:117] "RemoveContainer" containerID="6d53e5e3c44e90555a85b91a5271d9a90c1c05995acaa13cf2f1c3fca4936cca" Dec 03 14:33:52 crc kubenswrapper[5004]: I1203 14:33:52.824806 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:33:52 crc kubenswrapper[5004]: I1203 14:33:52.825804 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:34:22 crc kubenswrapper[5004]: I1203 14:34:22.824826 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:34:22 crc kubenswrapper[5004]: I1203 14:34:22.825372 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:34:22 crc kubenswrapper[5004]: I1203 14:34:22.825417 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:34:22 crc kubenswrapper[5004]: I1203 14:34:22.826086 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:34:22 crc kubenswrapper[5004]: I1203 14:34:22.826141 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" 
containerID="cri-o://c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" gracePeriod=600 Dec 03 14:34:23 crc kubenswrapper[5004]: E1203 14:34:23.125036 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:34:23 crc kubenswrapper[5004]: I1203 14:34:23.675431 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" exitCode=0 Dec 03 14:34:23 crc kubenswrapper[5004]: I1203 14:34:23.675473 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb"} Dec 03 14:34:23 crc kubenswrapper[5004]: I1203 14:34:23.675511 5004 scope.go:117] "RemoveContainer" containerID="6f8a1e811ed63200415b8b55aa6ea551896c03ef3f2d83a89506ba6c3ebccf0d" Dec 03 14:34:23 crc kubenswrapper[5004]: I1203 14:34:23.676208 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:34:23 crc kubenswrapper[5004]: E1203 14:34:23.676462 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:34:38 crc kubenswrapper[5004]: I1203 14:34:38.613395 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:34:38 crc kubenswrapper[5004]: E1203 14:34:38.614233 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:34:49 crc kubenswrapper[5004]: I1203 14:34:49.613473 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:34:49 crc kubenswrapper[5004]: E1203 14:34:49.614467 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:02 crc kubenswrapper[5004]: I1203 14:35:02.615607 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:35:02 crc kubenswrapper[5004]: E1203 14:35:02.616919 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:16 crc kubenswrapper[5004]: I1203 14:35:16.613357 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:35:16 crc kubenswrapper[5004]: E1203 14:35:16.615015 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:31 crc kubenswrapper[5004]: I1203 14:35:31.612962 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:35:31 crc kubenswrapper[5004]: E1203 14:35:31.613767 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:36 crc kubenswrapper[5004]: I1203 14:35:36.406620 5004 generic.go:334] "Generic (PLEG): container finished" podID="7a8c5468-695c-4238-9cae-3b010f6987ae" containerID="d3db6a619e42190a32608e612403c4339e9d87ff13008edea0e65daa3ded6898" exitCode=0 Dec 03 14:35:36 crc kubenswrapper[5004]: I1203 14:35:36.406709 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" 
event={"ID":"7a8c5468-695c-4238-9cae-3b010f6987ae","Type":"ContainerDied","Data":"d3db6a619e42190a32608e612403c4339e9d87ff13008edea0e65daa3ded6898"} Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.890747 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.942583 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory\") pod \"7a8c5468-695c-4238-9cae-3b010f6987ae\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.942679 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhjgf\" (UniqueName: \"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf\") pod \"7a8c5468-695c-4238-9cae-3b010f6987ae\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.942701 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle\") pod \"7a8c5468-695c-4238-9cae-3b010f6987ae\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.942801 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key\") pod \"7a8c5468-695c-4238-9cae-3b010f6987ae\" (UID: \"7a8c5468-695c-4238-9cae-3b010f6987ae\") " Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.948848 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf" (OuterVolumeSpecName: "kube-api-access-mhjgf") pod "7a8c5468-695c-4238-9cae-3b010f6987ae" (UID: "7a8c5468-695c-4238-9cae-3b010f6987ae"). InnerVolumeSpecName "kube-api-access-mhjgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.955067 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7a8c5468-695c-4238-9cae-3b010f6987ae" (UID: "7a8c5468-695c-4238-9cae-3b010f6987ae"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.972453 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory" (OuterVolumeSpecName: "inventory") pod "7a8c5468-695c-4238-9cae-3b010f6987ae" (UID: "7a8c5468-695c-4238-9cae-3b010f6987ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:37 crc kubenswrapper[5004]: I1203 14:35:37.973700 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7a8c5468-695c-4238-9cae-3b010f6987ae" (UID: "7a8c5468-695c-4238-9cae-3b010f6987ae"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.045533 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.045567 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhjgf\" (UniqueName: \"kubernetes.io/projected/7a8c5468-695c-4238-9cae-3b010f6987ae-kube-api-access-mhjgf\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.045579 5004 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.045586 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a8c5468-695c-4238-9cae-3b010f6987ae-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.445517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" event={"ID":"7a8c5468-695c-4238-9cae-3b010f6987ae","Type":"ContainerDied","Data":"f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06"} Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.445570 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f472a7dd0189ac52bcf28739f36e57eed57fdf0fd6c18b109366ea1211455a06" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.445574 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.527174 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt"] Dec 03 14:35:38 crc kubenswrapper[5004]: E1203 14:35:38.527969 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8c5468-695c-4238-9cae-3b010f6987ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.528014 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8c5468-695c-4238-9cae-3b010f6987ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.528550 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8c5468-695c-4238-9cae-3b010f6987ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.530026 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.532226 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.532454 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.532588 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.532725 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.540137 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt"] Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.657833 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.657896 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mbq\" (UniqueName: \"kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 
14:35:38.657924 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.760260 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.760825 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mbq\" (UniqueName: \"kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.760933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.765772 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.766135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.779770 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mbq\" (UniqueName: \"kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:38 crc kubenswrapper[5004]: I1203 14:35:38.857568 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:35:39 crc kubenswrapper[5004]: I1203 14:35:39.368528 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt"] Dec 03 14:35:39 crc kubenswrapper[5004]: W1203 14:35:39.370463 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf69ec7_959a_404b_9bae_24bc3c528c28.slice/crio-e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623 WatchSource:0}: Error finding container e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623: Status 404 returned error can't find the container with id e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623 Dec 03 14:35:39 crc kubenswrapper[5004]: I1203 14:35:39.372887 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:35:39 crc kubenswrapper[5004]: I1203 14:35:39.453573 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" event={"ID":"faf69ec7-959a-404b-9bae-24bc3c528c28","Type":"ContainerStarted","Data":"e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623"} Dec 03 14:35:42 crc kubenswrapper[5004]: I1203 14:35:42.479918 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" event={"ID":"faf69ec7-959a-404b-9bae-24bc3c528c28","Type":"ContainerStarted","Data":"fb7b9cb621267933efc646181a72cef1e37bd14ce4dc6152e7651fc177a948b2"} Dec 03 14:35:42 crc kubenswrapper[5004]: I1203 14:35:42.522054 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" podStartSLOduration=2.733211712 podStartE2EDuration="4.522031575s" podCreationTimestamp="2025-12-03 14:35:38 
+0000 UTC" firstStartedPulling="2025-12-03 14:35:39.372642294 +0000 UTC m=+1752.121612520" lastFinishedPulling="2025-12-03 14:35:41.161462107 +0000 UTC m=+1753.910432383" observedRunningTime="2025-12-03 14:35:42.5030692 +0000 UTC m=+1755.252039466" watchObservedRunningTime="2025-12-03 14:35:42.522031575 +0000 UTC m=+1755.271001801" Dec 03 14:35:42 crc kubenswrapper[5004]: I1203 14:35:42.613809 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:35:42 crc kubenswrapper[5004]: E1203 14:35:42.614232 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:51 crc kubenswrapper[5004]: I1203 14:35:51.042699 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2228-account-create-update-zr2sq"] Dec 03 14:35:51 crc kubenswrapper[5004]: I1203 14:35:51.053109 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2228-account-create-update-zr2sq"] Dec 03 14:35:51 crc kubenswrapper[5004]: I1203 14:35:51.627529 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30fd1af-dedb-4b6c-a3fd-5a327ac580e4" path="/var/lib/kubelet/pods/d30fd1af-dedb-4b6c-a3fd-5a327ac580e4/volumes" Dec 03 14:35:56 crc kubenswrapper[5004]: I1203 14:35:56.612826 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:35:56 crc kubenswrapper[5004]: E1203 14:35:56.613809 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.056222 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4524-account-create-update-hvf8r"] Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.067276 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4wszb"] Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.077761 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pdq5g"] Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.088701 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4524-account-create-update-hvf8r"] Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.101180 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pdq5g"] Dec 03 14:35:58 crc kubenswrapper[5004]: I1203 14:35:58.109173 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4wszb"] Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.038307 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-chkcv"] Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.055544 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2c8b-account-create-update-85hzs"] Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.066777 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-chkcv"] Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.074236 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2c8b-account-create-update-85hzs"] Dec 03 14:35:59 crc kubenswrapper[5004]: 
I1203 14:35:59.627292 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d44f905-8c97-4b62-89e4-a3929a8a2042" path="/var/lib/kubelet/pods/6d44f905-8c97-4b62-89e4-a3929a8a2042/volumes" Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.628470 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee" path="/var/lib/kubelet/pods/c0e4ef9c-cb19-43bd-9d73-ecf7758b07ee/volumes" Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.629606 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bf012d-617e-44d4-a5c3-6101921a5ece" path="/var/lib/kubelet/pods/c9bf012d-617e-44d4-a5c3-6101921a5ece/volumes" Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.630693 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b" path="/var/lib/kubelet/pods/e0c5a19d-1c2f-42a4-b6d0-683c02a7af8b/volumes" Dec 03 14:35:59 crc kubenswrapper[5004]: I1203 14:35:59.632692 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd567a61-b577-4119-8bd1-59039ffc45e8" path="/var/lib/kubelet/pods/fd567a61-b577-4119-8bd1-59039ffc45e8/volumes" Dec 03 14:36:08 crc kubenswrapper[5004]: I1203 14:36:08.613320 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:36:08 crc kubenswrapper[5004]: E1203 14:36:08.614085 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.045772 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-e791-account-create-update-bgrw9"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.056098 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e16c-account-create-update-7d2pm"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.067789 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-fgpcz"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.075121 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-v49l9"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.082717 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e791-account-create-update-bgrw9"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.089965 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-v49l9"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.097737 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-fgpcz"] Dec 03 14:36:16 crc kubenswrapper[5004]: I1203 14:36:16.105022 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e16c-account-create-update-7d2pm"] Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.023640 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0e9a-account-create-update-z2skj"] Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.032180 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2dh4h"] Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.042914 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0e9a-account-create-update-z2skj"] Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.050643 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2dh4h"] Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.624370 5004 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="08a45345-f85d-4134-87d4-70377be8f7cf" path="/var/lib/kubelet/pods/08a45345-f85d-4134-87d4-70377be8f7cf/volumes" Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.625089 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f6fb8e-6887-4ca5-8fc6-3c44db29d84d" path="/var/lib/kubelet/pods/24f6fb8e-6887-4ca5-8fc6-3c44db29d84d/volumes" Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.625760 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811a96c5-1501-47c0-a372-702a55e5182f" path="/var/lib/kubelet/pods/811a96c5-1501-47c0-a372-702a55e5182f/volumes" Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.626457 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8590fe31-fb29-40a9-b61e-569709bf9008" path="/var/lib/kubelet/pods/8590fe31-fb29-40a9-b61e-569709bf9008/volumes" Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.627654 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de6f3fa-11a3-4730-8424-47207b77ca2d" path="/var/lib/kubelet/pods/8de6f3fa-11a3-4730-8424-47207b77ca2d/volumes" Dec 03 14:36:17 crc kubenswrapper[5004]: I1203 14:36:17.628278 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22aca07-9901-4532-8e36-e4ef14be0a26" path="/var/lib/kubelet/pods/f22aca07-9901-4532-8e36-e4ef14be0a26/volumes" Dec 03 14:36:20 crc kubenswrapper[5004]: I1203 14:36:20.620340 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:36:20 crc kubenswrapper[5004]: E1203 14:36:20.622048 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:36:23 crc kubenswrapper[5004]: I1203 14:36:23.032322 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lnxkp"] Dec 03 14:36:23 crc kubenswrapper[5004]: I1203 14:36:23.040731 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lnxkp"] Dec 03 14:36:23 crc kubenswrapper[5004]: I1203 14:36:23.626609 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76bed8f2-08f1-41f0-beb0-d0a2ded315bf" path="/var/lib/kubelet/pods/76bed8f2-08f1-41f0-beb0-d0a2ded315bf/volumes" Dec 03 14:36:25 crc kubenswrapper[5004]: I1203 14:36:25.028944 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-gq5tr"] Dec 03 14:36:25 crc kubenswrapper[5004]: I1203 14:36:25.041379 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-gq5tr"] Dec 03 14:36:25 crc kubenswrapper[5004]: I1203 14:36:25.623783 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa5e91c-186c-419f-b6a3-95c486ff267d" path="/var/lib/kubelet/pods/7aa5e91c-186c-419f-b6a3-95c486ff267d/volumes" Dec 03 14:36:34 crc kubenswrapper[5004]: I1203 14:36:34.612892 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:36:34 crc kubenswrapper[5004]: E1203 14:36:34.613649 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.288298 5004 scope.go:117] 
"RemoveContainer" containerID="1e78886f451ae9e3e253ae331d091b6ff038e999da4076a143dfea51c324dc76" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.313084 5004 scope.go:117] "RemoveContainer" containerID="59e34967bc80036bb597335aba32998347f1ccb617da5da2b6ca195c41f328b6" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.360194 5004 scope.go:117] "RemoveContainer" containerID="342e97dddeda55b10502a5b49f567b58e80d1673f2cdffad657c2d2b0905228b" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.410830 5004 scope.go:117] "RemoveContainer" containerID="db43822ea38c33ddc82bfbab2e0e49d58b62465f107c9874c0574f913c623746" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.464229 5004 scope.go:117] "RemoveContainer" containerID="b84ec5ab5f7aa76f2683c06ad97380104713eaa7ec6767fd2db29e2aa4c86697" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.510109 5004 scope.go:117] "RemoveContainer" containerID="e1ec5dc0002d8c342388f403b03d9bfbf915d1efa0335bddc020897fa20bf841" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.537520 5004 scope.go:117] "RemoveContainer" containerID="40ef99f63c347aa5849e1574a9221ee9d757da0db9ea62343531ea6338240815" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.569289 5004 scope.go:117] "RemoveContainer" containerID="3ccaa28ad38a276cedacc6bddedd2c59654a79661c31faa4bfe4e328a8dac899" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.589707 5004 scope.go:117] "RemoveContainer" containerID="37d1103f002984f67aaf6772b4a0a58a8d130bfc087a7955ec35122714950641" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.609686 5004 scope.go:117] "RemoveContainer" containerID="18f383dbc9f667005300b9f4adfb832d43f150cfc1d8c5b5e7f6775b0632d249" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.639047 5004 scope.go:117] "RemoveContainer" containerID="0717f1a84b54ca455ede6730c25f68eccb072f25095deb51eac907caf59c8d66" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.660089 5004 scope.go:117] "RemoveContainer" 
containerID="b947b7d25ec0c5ca1d7a5ddac28aeaded751ac4c3d58a6ebc3eea09fc47c6b92" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.678873 5004 scope.go:117] "RemoveContainer" containerID="360e87564925cc1ed866f6a965f4c597ad769bf2dc578c168aae01d9c3c8406b" Dec 03 14:36:35 crc kubenswrapper[5004]: I1203 14:36:35.699521 5004 scope.go:117] "RemoveContainer" containerID="a582e484dbb2a3817dbe16f7da8e11ec2bb058999140ca903ce779db3b968118" Dec 03 14:36:46 crc kubenswrapper[5004]: I1203 14:36:46.613272 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:36:46 crc kubenswrapper[5004]: E1203 14:36:46.614159 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:37:01 crc kubenswrapper[5004]: I1203 14:37:01.612965 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:37:01 crc kubenswrapper[5004]: E1203 14:37:01.613757 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:37:08 crc kubenswrapper[5004]: I1203 14:37:08.041851 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-c9xfn"] Dec 03 14:37:08 crc kubenswrapper[5004]: I1203 
14:37:08.051435 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-c9xfn"] Dec 03 14:37:09 crc kubenswrapper[5004]: I1203 14:37:09.633299 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d38e9d-5aea-4c66-8c17-cc31d9494116" path="/var/lib/kubelet/pods/84d38e9d-5aea-4c66-8c17-cc31d9494116/volumes" Dec 03 14:37:10 crc kubenswrapper[5004]: I1203 14:37:10.040621 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8csk5"] Dec 03 14:37:10 crc kubenswrapper[5004]: I1203 14:37:10.054917 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8csk5"] Dec 03 14:37:11 crc kubenswrapper[5004]: I1203 14:37:11.626323 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb30bfb-060c-4c8f-aab0-6ca0befc83d6" path="/var/lib/kubelet/pods/7eb30bfb-060c-4c8f-aab0-6ca0befc83d6/volumes" Dec 03 14:37:12 crc kubenswrapper[5004]: I1203 14:37:12.613495 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:37:12 crc kubenswrapper[5004]: E1203 14:37:12.614534 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:37:15 crc kubenswrapper[5004]: I1203 14:37:15.415366 5004 generic.go:334] "Generic (PLEG): container finished" podID="faf69ec7-959a-404b-9bae-24bc3c528c28" containerID="fb7b9cb621267933efc646181a72cef1e37bd14ce4dc6152e7651fc177a948b2" exitCode=0 Dec 03 14:37:15 crc kubenswrapper[5004]: I1203 14:37:15.415431 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" event={"ID":"faf69ec7-959a-404b-9bae-24bc3c528c28","Type":"ContainerDied","Data":"fb7b9cb621267933efc646181a72cef1e37bd14ce4dc6152e7651fc177a948b2"} Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.845717 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.952470 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory\") pod \"faf69ec7-959a-404b-9bae-24bc3c528c28\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.952652 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key\") pod \"faf69ec7-959a-404b-9bae-24bc3c528c28\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.952769 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mbq\" (UniqueName: \"kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq\") pod \"faf69ec7-959a-404b-9bae-24bc3c528c28\" (UID: \"faf69ec7-959a-404b-9bae-24bc3c528c28\") " Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.957729 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq" (OuterVolumeSpecName: "kube-api-access-78mbq") pod "faf69ec7-959a-404b-9bae-24bc3c528c28" (UID: "faf69ec7-959a-404b-9bae-24bc3c528c28"). InnerVolumeSpecName "kube-api-access-78mbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.982064 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory" (OuterVolumeSpecName: "inventory") pod "faf69ec7-959a-404b-9bae-24bc3c528c28" (UID: "faf69ec7-959a-404b-9bae-24bc3c528c28"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:16 crc kubenswrapper[5004]: I1203 14:37:16.999301 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "faf69ec7-959a-404b-9bae-24bc3c528c28" (UID: "faf69ec7-959a-404b-9bae-24bc3c528c28"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.045190 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xpj88"] Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.054871 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.054910 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faf69ec7-959a-404b-9bae-24bc3c528c28-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.054921 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mbq\" (UniqueName: \"kubernetes.io/projected/faf69ec7-959a-404b-9bae-24bc3c528c28-kube-api-access-78mbq\") on node \"crc\" DevicePath \"\"" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.057135 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xpj88"] 
Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.431717 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" event={"ID":"faf69ec7-959a-404b-9bae-24bc3c528c28","Type":"ContainerDied","Data":"e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623"} Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.431765 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5dc912d13a6acc144585b3b720024856c08ec47c5ed144384d3074c33871623" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.432131 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.516199 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv"] Dec 03 14:37:17 crc kubenswrapper[5004]: E1203 14:37:17.516714 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf69ec7-959a-404b-9bae-24bc3c528c28" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.516741 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf69ec7-959a-404b-9bae-24bc3c528c28" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.516969 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf69ec7-959a-404b-9bae-24bc3c528c28" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.518479 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.520918 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.520981 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.520995 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.521142 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.526545 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv"] Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.565177 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.565398 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgjz\" (UniqueName: \"kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 
14:37:17.565769 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.624484 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124d03f5-14a3-430b-acb8-4a2b2fb79d37" path="/var/lib/kubelet/pods/124d03f5-14a3-430b-acb8-4a2b2fb79d37/volumes" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.668952 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crgjz\" (UniqueName: \"kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.669113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.669171 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" 
Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.673905 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.673928 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.690323 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgjz\" (UniqueName: \"kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:17 crc kubenswrapper[5004]: I1203 14:37:17.834717 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:37:18 crc kubenswrapper[5004]: I1203 14:37:18.558902 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv"] Dec 03 14:37:19 crc kubenswrapper[5004]: I1203 14:37:19.449904 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" event={"ID":"632aa0c1-b525-45af-8254-2f0f0dc57c43","Type":"ContainerStarted","Data":"8b6cbbfe779c8d5ef4b5b50a2cced874961293bd429ec2cde6c5910742bec0b3"} Dec 03 14:37:20 crc kubenswrapper[5004]: I1203 14:37:20.459720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" event={"ID":"632aa0c1-b525-45af-8254-2f0f0dc57c43","Type":"ContainerStarted","Data":"8c54b658f68067671dcb0f11f7cc7e2772ea8ea08ba10a7a65a016e4608f2bab"} Dec 03 14:37:20 crc kubenswrapper[5004]: I1203 14:37:20.482170 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" podStartSLOduration=2.825924465 podStartE2EDuration="3.482153984s" podCreationTimestamp="2025-12-03 14:37:17 +0000 UTC" firstStartedPulling="2025-12-03 14:37:18.638115085 +0000 UTC m=+1851.387085321" lastFinishedPulling="2025-12-03 14:37:19.294344604 +0000 UTC m=+1852.043314840" observedRunningTime="2025-12-03 14:37:20.479493277 +0000 UTC m=+1853.228463513" watchObservedRunningTime="2025-12-03 14:37:20.482153984 +0000 UTC m=+1853.231124220" Dec 03 14:37:23 crc kubenswrapper[5004]: I1203 14:37:23.613550 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:37:23 crc kubenswrapper[5004]: E1203 14:37:23.615183 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:37:26 crc kubenswrapper[5004]: I1203 14:37:26.032793 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-646bh"] Dec 03 14:37:26 crc kubenswrapper[5004]: I1203 14:37:26.041449 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-646bh"] Dec 03 14:37:27 crc kubenswrapper[5004]: I1203 14:37:27.631744 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52" path="/var/lib/kubelet/pods/c2f9a1da-0aa3-42d1-83f4-9e91b75f8d52/volumes" Dec 03 14:37:28 crc kubenswrapper[5004]: I1203 14:37:28.028525 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wv94f"] Dec 03 14:37:28 crc kubenswrapper[5004]: I1203 14:37:28.036778 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wv94f"] Dec 03 14:37:29 crc kubenswrapper[5004]: I1203 14:37:29.626079 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82059d63-43a0-43ed-b9ea-9c54f700a2dc" path="/var/lib/kubelet/pods/82059d63-43a0-43ed-b9ea-9c54f700a2dc/volumes" Dec 03 14:37:35 crc kubenswrapper[5004]: I1203 14:37:35.921104 5004 scope.go:117] "RemoveContainer" containerID="7868aa0b8066bfecd1cb2ca83f6c3f89c7491195180e8311bc7539b3977d899e" Dec 03 14:37:35 crc kubenswrapper[5004]: I1203 14:37:35.982647 5004 scope.go:117] "RemoveContainer" containerID="3a2d6e5ffe7db56bc7dbd8e07419385f55f9b50e5496296af21e05436f870d95" Dec 03 14:37:36 crc kubenswrapper[5004]: I1203 14:37:36.006798 5004 scope.go:117] "RemoveContainer" 
containerID="e3762ab41b089bbb2b654cb032992671bc7c5239579143c08ef4e7476721e67f" Dec 03 14:37:36 crc kubenswrapper[5004]: I1203 14:37:36.091146 5004 scope.go:117] "RemoveContainer" containerID="29531f1ce2a7659a3479164bbf9895db81399c25c170f8e20b074c4f392b167c" Dec 03 14:37:36 crc kubenswrapper[5004]: I1203 14:37:36.138325 5004 scope.go:117] "RemoveContainer" containerID="d2b99edabb08722467a07ede584711b4db2a567eb0c271df6d45257baa7fcb24" Dec 03 14:37:38 crc kubenswrapper[5004]: I1203 14:37:38.612317 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:37:38 crc kubenswrapper[5004]: E1203 14:37:38.612817 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:37:50 crc kubenswrapper[5004]: I1203 14:37:50.613300 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:37:50 crc kubenswrapper[5004]: E1203 14:37:50.614484 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:02 crc kubenswrapper[5004]: I1203 14:38:02.613120 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:38:02 crc 
kubenswrapper[5004]: E1203 14:38:02.613805 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:12 crc kubenswrapper[5004]: I1203 14:38:12.062494 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gl6js"] Dec 03 14:38:12 crc kubenswrapper[5004]: I1203 14:38:12.079109 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gl6js"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.039125 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pwhkz"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.049218 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6611-account-create-update-vs8jg"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.063222 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-g7df6"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.072849 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a3f9-account-create-update-j96b4"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.081932 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4b76-account-create-update-v5798"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.091669 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pwhkz"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.101295 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6611-account-create-update-vs8jg"] Dec 03 
14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.109715 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4b76-account-create-update-v5798"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.119204 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a3f9-account-create-update-j96b4"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.128667 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-g7df6"] Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.614094 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:38:13 crc kubenswrapper[5004]: E1203 14:38:13.614296 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.625478 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e053df-7662-47f4-bd6d-ed3f75ca1901" path="/var/lib/kubelet/pods/16e053df-7662-47f4-bd6d-ed3f75ca1901/volumes" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.626433 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57216442-7799-4751-8116-ba7d842d4be9" path="/var/lib/kubelet/pods/57216442-7799-4751-8116-ba7d842d4be9/volumes" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.627120 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804a1196-3d3c-4e15-8a2c-3ae12a943249" path="/var/lib/kubelet/pods/804a1196-3d3c-4e15-8a2c-3ae12a943249/volumes" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.627711 
5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b84bb98-6817-47a4-8d88-dfeb6a19e195" path="/var/lib/kubelet/pods/9b84bb98-6817-47a4-8d88-dfeb6a19e195/volumes" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.628773 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df3e0e7-8b8b-4f47-8b19-40afbab582d6" path="/var/lib/kubelet/pods/9df3e0e7-8b8b-4f47-8b19-40afbab582d6/volumes" Dec 03 14:38:13 crc kubenswrapper[5004]: I1203 14:38:13.629405 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2587cd-e48f-400d-b782-04f2c573862a" path="/var/lib/kubelet/pods/ed2587cd-e48f-400d-b782-04f2c573862a/volumes" Dec 03 14:38:24 crc kubenswrapper[5004]: I1203 14:38:24.613399 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:38:24 crc kubenswrapper[5004]: E1203 14:38:24.614334 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:30 crc kubenswrapper[5004]: I1203 14:38:30.100785 5004 generic.go:334] "Generic (PLEG): container finished" podID="632aa0c1-b525-45af-8254-2f0f0dc57c43" containerID="8c54b658f68067671dcb0f11f7cc7e2772ea8ea08ba10a7a65a016e4608f2bab" exitCode=0 Dec 03 14:38:30 crc kubenswrapper[5004]: I1203 14:38:30.100981 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" event={"ID":"632aa0c1-b525-45af-8254-2f0f0dc57c43","Type":"ContainerDied","Data":"8c54b658f68067671dcb0f11f7cc7e2772ea8ea08ba10a7a65a016e4608f2bab"} Dec 03 14:38:31 crc kubenswrapper[5004]: 
I1203 14:38:31.539233 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.605412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory\") pod \"632aa0c1-b525-45af-8254-2f0f0dc57c43\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.605779 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crgjz\" (UniqueName: \"kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz\") pod \"632aa0c1-b525-45af-8254-2f0f0dc57c43\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.605919 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key\") pod \"632aa0c1-b525-45af-8254-2f0f0dc57c43\" (UID: \"632aa0c1-b525-45af-8254-2f0f0dc57c43\") " Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.611171 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz" (OuterVolumeSpecName: "kube-api-access-crgjz") pod "632aa0c1-b525-45af-8254-2f0f0dc57c43" (UID: "632aa0c1-b525-45af-8254-2f0f0dc57c43"). InnerVolumeSpecName "kube-api-access-crgjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.640597 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory" (OuterVolumeSpecName: "inventory") pod "632aa0c1-b525-45af-8254-2f0f0dc57c43" (UID: "632aa0c1-b525-45af-8254-2f0f0dc57c43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.640967 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "632aa0c1-b525-45af-8254-2f0f0dc57c43" (UID: "632aa0c1-b525-45af-8254-2f0f0dc57c43"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.707631 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.707885 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crgjz\" (UniqueName: \"kubernetes.io/projected/632aa0c1-b525-45af-8254-2f0f0dc57c43-kube-api-access-crgjz\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:31 crc kubenswrapper[5004]: I1203 14:38:31.707960 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/632aa0c1-b525-45af-8254-2f0f0dc57c43-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.124846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" 
event={"ID":"632aa0c1-b525-45af-8254-2f0f0dc57c43","Type":"ContainerDied","Data":"8b6cbbfe779c8d5ef4b5b50a2cced874961293bd429ec2cde6c5910742bec0b3"} Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.124915 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6cbbfe779c8d5ef4b5b50a2cced874961293bd429ec2cde6c5910742bec0b3" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.124932 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.209813 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2"] Dec 03 14:38:32 crc kubenswrapper[5004]: E1203 14:38:32.210248 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632aa0c1-b525-45af-8254-2f0f0dc57c43" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.210272 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="632aa0c1-b525-45af-8254-2f0f0dc57c43" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.210477 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="632aa0c1-b525-45af-8254-2f0f0dc57c43" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.212439 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.218831 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.218975 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.218997 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.219768 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.247962 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2"] Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.318314 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87h9\" (UniqueName: \"kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.318367 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 
14:38:32.318534 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.419848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87h9\" (UniqueName: \"kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.419959 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.420096 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.424135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.436102 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.442053 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87h9\" (UniqueName: \"kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kftm2\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:32 crc kubenswrapper[5004]: I1203 14:38:32.538656 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:33 crc kubenswrapper[5004]: I1203 14:38:33.105757 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2"] Dec 03 14:38:33 crc kubenswrapper[5004]: I1203 14:38:33.134125 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" event={"ID":"8f5d5c71-22c1-4bd4-a95d-8865928a48c3","Type":"ContainerStarted","Data":"e3768ef958db38af7ded31a8518a92ca0c13dd0339453e9d73e342b0b49dea44"} Dec 03 14:38:34 crc kubenswrapper[5004]: I1203 14:38:34.146589 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" event={"ID":"8f5d5c71-22c1-4bd4-a95d-8865928a48c3","Type":"ContainerStarted","Data":"469b537866c2c5c4483778e193d90730ecd2579d92de3376fce18a9e8bc0daf1"} Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.271081 5004 scope.go:117] "RemoveContainer" containerID="0e467b3d99b383fffebd71c6ff56d045e9ac3e107a0c125f3f1d4893728253f5" Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.664633 5004 scope.go:117] "RemoveContainer" containerID="b7fde29958f276d04f31bea0fde651a061cd6328195f0f202053bc8f3874f3f1" Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.709840 5004 scope.go:117] "RemoveContainer" containerID="b58f8a8ee5cb386d03a7e711fe2171ccb92f54f4232d8b5ee394f908b718522d" Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.752660 5004 scope.go:117] "RemoveContainer" containerID="0957ac13149aa15a69ae904702bc69f4ed5ec77d67e963fca236bd6c4226da6c" Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.862608 5004 scope.go:117] "RemoveContainer" containerID="fafc18bfee759b2a197049db6386283346b941a560625fb31f1fc8819e9d4025" Dec 03 14:38:36 crc kubenswrapper[5004]: I1203 14:38:36.887274 5004 scope.go:117] "RemoveContainer" 
containerID="d9e43b9bf20d31643d65208e19b5996b5a3a0a10177f52588dc76f0f50ee65bb" Dec 03 14:38:39 crc kubenswrapper[5004]: I1203 14:38:39.198913 5004 generic.go:334] "Generic (PLEG): container finished" podID="8f5d5c71-22c1-4bd4-a95d-8865928a48c3" containerID="469b537866c2c5c4483778e193d90730ecd2579d92de3376fce18a9e8bc0daf1" exitCode=0 Dec 03 14:38:39 crc kubenswrapper[5004]: I1203 14:38:39.199638 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" event={"ID":"8f5d5c71-22c1-4bd4-a95d-8865928a48c3","Type":"ContainerDied","Data":"469b537866c2c5c4483778e193d90730ecd2579d92de3376fce18a9e8bc0daf1"} Dec 03 14:38:39 crc kubenswrapper[5004]: I1203 14:38:39.613209 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:38:39 crc kubenswrapper[5004]: E1203 14:38:39.613617 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.626008 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.787297 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d87h9\" (UniqueName: \"kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9\") pod \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.787411 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key\") pod \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.787475 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory\") pod \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\" (UID: \"8f5d5c71-22c1-4bd4-a95d-8865928a48c3\") " Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.795256 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9" (OuterVolumeSpecName: "kube-api-access-d87h9") pod "8f5d5c71-22c1-4bd4-a95d-8865928a48c3" (UID: "8f5d5c71-22c1-4bd4-a95d-8865928a48c3"). InnerVolumeSpecName "kube-api-access-d87h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.823527 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory" (OuterVolumeSpecName: "inventory") pod "8f5d5c71-22c1-4bd4-a95d-8865928a48c3" (UID: "8f5d5c71-22c1-4bd4-a95d-8865928a48c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.831394 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f5d5c71-22c1-4bd4-a95d-8865928a48c3" (UID: "8f5d5c71-22c1-4bd4-a95d-8865928a48c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.891131 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.891180 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87h9\" (UniqueName: \"kubernetes.io/projected/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-kube-api-access-d87h9\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:40 crc kubenswrapper[5004]: I1203 14:38:40.891195 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f5d5c71-22c1-4bd4-a95d-8865928a48c3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.222721 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" event={"ID":"8f5d5c71-22c1-4bd4-a95d-8865928a48c3","Type":"ContainerDied","Data":"e3768ef958db38af7ded31a8518a92ca0c13dd0339453e9d73e342b0b49dea44"} Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.222780 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kftm2" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.222780 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3768ef958db38af7ded31a8518a92ca0c13dd0339453e9d73e342b0b49dea44" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.364421 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8"] Dec 03 14:38:41 crc kubenswrapper[5004]: E1203 14:38:41.364994 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d5c71-22c1-4bd4-a95d-8865928a48c3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.365027 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d5c71-22c1-4bd4-a95d-8865928a48c3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.365388 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d5c71-22c1-4bd4-a95d-8865928a48c3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.366459 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.370289 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.370545 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.370953 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.371819 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.382823 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8"] Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.507250 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.507350 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.508572 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntnt7\" (UniqueName: \"kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.610746 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.610969 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.611133 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntnt7\" (UniqueName: \"kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.616500 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: 
\"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.616599 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.630667 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntnt7\" (UniqueName: \"kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9ggz8\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:41 crc kubenswrapper[5004]: I1203 14:38:41.696204 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:38:42 crc kubenswrapper[5004]: I1203 14:38:42.355081 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8"] Dec 03 14:38:43 crc kubenswrapper[5004]: I1203 14:38:43.245060 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" event={"ID":"5de0359a-b8f8-4989-8739-ee565cd596fe","Type":"ContainerStarted","Data":"b360f02ff83251af3ef2981e4f5e5a287f8251694b4492bac8f9f1ef98112a4c"} Dec 03 14:38:45 crc kubenswrapper[5004]: I1203 14:38:45.274045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" event={"ID":"5de0359a-b8f8-4989-8739-ee565cd596fe","Type":"ContainerStarted","Data":"5c8c3e014d485e4c9e79b8c501a5ce83ecb354557975ddfbcca38b1e022ffe01"} Dec 03 14:38:45 crc kubenswrapper[5004]: I1203 14:38:45.303367 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" podStartSLOduration=1.958066819 podStartE2EDuration="4.303340085s" podCreationTimestamp="2025-12-03 14:38:41 +0000 UTC" firstStartedPulling="2025-12-03 14:38:42.367122615 +0000 UTC m=+1935.116092871" lastFinishedPulling="2025-12-03 14:38:44.712395901 +0000 UTC m=+1937.461366137" observedRunningTime="2025-12-03 14:38:45.292166145 +0000 UTC m=+1938.041136381" watchObservedRunningTime="2025-12-03 14:38:45.303340085 +0000 UTC m=+1938.052310321" Dec 03 14:38:50 crc kubenswrapper[5004]: I1203 14:38:50.613277 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:38:50 crc kubenswrapper[5004]: E1203 14:38:50.614159 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:38:57 crc kubenswrapper[5004]: I1203 14:38:57.038199 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9twz"] Dec 03 14:38:57 crc kubenswrapper[5004]: I1203 14:38:57.047421 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9twz"] Dec 03 14:38:57 crc kubenswrapper[5004]: I1203 14:38:57.631273 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2193ca31-82a6-4591-ae77-79ffa853b938" path="/var/lib/kubelet/pods/2193ca31-82a6-4591-ae77-79ffa853b938/volumes" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.613737 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:39:02 crc kubenswrapper[5004]: E1203 14:39:02.614497 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.619999 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.623557 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.632847 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.632924 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.632969 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnqk\" (UniqueName: \"kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.633315 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.735435 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnqk\" (UniqueName: \"kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.735606 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.735631 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.736106 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.736213 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.761840 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnqk\" (UniqueName: \"kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk\") pod \"redhat-operators-r8wqg\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:02 crc kubenswrapper[5004]: I1203 14:39:02.983771 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:03 crc kubenswrapper[5004]: I1203 14:39:03.444746 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:04 crc kubenswrapper[5004]: I1203 14:39:04.453112 5004 generic.go:334] "Generic (PLEG): container finished" podID="a21c70ce-4274-41b2-8856-1d97004089a4" containerID="2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b" exitCode=0 Dec 03 14:39:04 crc kubenswrapper[5004]: I1203 14:39:04.453214 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerDied","Data":"2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b"} Dec 03 14:39:04 crc kubenswrapper[5004]: I1203 14:39:04.453463 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerStarted","Data":"e6c91347737c10d8a1a00a4e0e792c5cd620467e43652cc92b60274b2033f477"} Dec 03 14:39:06 crc kubenswrapper[5004]: I1203 14:39:06.475118 5004 generic.go:334] "Generic (PLEG): container finished" podID="a21c70ce-4274-41b2-8856-1d97004089a4" containerID="9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c" exitCode=0 Dec 03 14:39:06 crc kubenswrapper[5004]: I1203 14:39:06.475209 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerDied","Data":"9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c"} Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.113690 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.116887 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.157022 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.189173 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.189257 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6589x\" (UniqueName: \"kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.189284 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.290697 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.290756 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6589x\" (UniqueName: \"kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.290778 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.291214 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.291439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.309899 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6589x\" (UniqueName: \"kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x\") pod \"community-operators-brlb6\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.444510 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.560564 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerStarted","Data":"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3"} Dec 03 14:39:14 crc kubenswrapper[5004]: I1203 14:39:14.620900 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:39:14 crc kubenswrapper[5004]: E1203 14:39:14.621354 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:39:15 crc kubenswrapper[5004]: I1203 14:39:15.060396 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:15 crc kubenswrapper[5004]: W1203 14:39:15.382154 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463 WatchSource:0}: Error finding container 538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463: Status 404 returned error can't find the container with id 538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463 Dec 03 14:39:15 crc kubenswrapper[5004]: I1203 14:39:15.575300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" 
event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerStarted","Data":"538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463"} Dec 03 14:39:15 crc kubenswrapper[5004]: I1203 14:39:15.593827 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8wqg" podStartSLOduration=10.52124729 podStartE2EDuration="13.593807059s" podCreationTimestamp="2025-12-03 14:39:02 +0000 UTC" firstStartedPulling="2025-12-03 14:39:04.454894884 +0000 UTC m=+1957.203865130" lastFinishedPulling="2025-12-03 14:39:07.527454663 +0000 UTC m=+1960.276424899" observedRunningTime="2025-12-03 14:39:15.592392849 +0000 UTC m=+1968.341363115" watchObservedRunningTime="2025-12-03 14:39:15.593807059 +0000 UTC m=+1968.342777295" Dec 03 14:39:16 crc kubenswrapper[5004]: I1203 14:39:16.584413 5004 generic.go:334] "Generic (PLEG): container finished" podID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerID="9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966" exitCode=0 Dec 03 14:39:16 crc kubenswrapper[5004]: I1203 14:39:16.584498 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerDied","Data":"9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966"} Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.045981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wb76m"] Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.053852 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-79vfv"] Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.062313 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wb76m"] Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.071997 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-79vfv"] Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.629273 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b52e9683-d017-4a09-a6fa-5377df5032e1" path="/var/lib/kubelet/pods/b52e9683-d017-4a09-a6fa-5377df5032e1/volumes" Dec 03 14:39:17 crc kubenswrapper[5004]: I1203 14:39:17.629845 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf" path="/var/lib/kubelet/pods/d05c2eb8-8c86-4ca7-b26a-29d57f1ceccf/volumes" Dec 03 14:39:18 crc kubenswrapper[5004]: I1203 14:39:18.606976 5004 generic.go:334] "Generic (PLEG): container finished" podID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerID="1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960" exitCode=0 Dec 03 14:39:18 crc kubenswrapper[5004]: I1203 14:39:18.607029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerDied","Data":"1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960"} Dec 03 14:39:19 crc kubenswrapper[5004]: I1203 14:39:19.624385 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerStarted","Data":"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569"} Dec 03 14:39:22 crc kubenswrapper[5004]: I1203 14:39:22.984390 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:22 crc kubenswrapper[5004]: I1203 14:39:22.985954 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:23 crc kubenswrapper[5004]: I1203 14:39:23.047981 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:23 crc kubenswrapper[5004]: I1203 14:39:23.080509 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brlb6" podStartSLOduration=6.635491255 podStartE2EDuration="9.080478028s" podCreationTimestamp="2025-12-03 14:39:14 +0000 UTC" firstStartedPulling="2025-12-03 14:39:16.587340001 +0000 UTC m=+1969.336310227" lastFinishedPulling="2025-12-03 14:39:19.032326764 +0000 UTC m=+1971.781297000" observedRunningTime="2025-12-03 14:39:19.64010075 +0000 UTC m=+1972.389070986" watchObservedRunningTime="2025-12-03 14:39:23.080478028 +0000 UTC m=+1975.829448304" Dec 03 14:39:23 crc kubenswrapper[5004]: I1203 14:39:23.701741 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:23 crc kubenswrapper[5004]: I1203 14:39:23.745487 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.445378 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.445772 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.492276 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.681658 5004 generic.go:334] "Generic (PLEG): container finished" podID="5de0359a-b8f8-4989-8739-ee565cd596fe" containerID="5c8c3e014d485e4c9e79b8c501a5ce83ecb354557975ddfbcca38b1e022ffe01" exitCode=0 Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.681768 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" event={"ID":"5de0359a-b8f8-4989-8739-ee565cd596fe","Type":"ContainerDied","Data":"5c8c3e014d485e4c9e79b8c501a5ce83ecb354557975ddfbcca38b1e022ffe01"} Dec 03 14:39:24 crc kubenswrapper[5004]: I1203 14:39:24.730994 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:25 crc kubenswrapper[5004]: I1203 14:39:25.613919 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:39:25 crc kubenswrapper[5004]: I1203 14:39:25.705012 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8wqg" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="registry-server" containerID="cri-o://56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3" gracePeriod=2 Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.108872 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.161450 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.340842 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntnt7\" (UniqueName: \"kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7\") pod \"5de0359a-b8f8-4989-8739-ee565cd596fe\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.340921 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key\") pod \"5de0359a-b8f8-4989-8739-ee565cd596fe\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.341059 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory\") pod \"5de0359a-b8f8-4989-8739-ee565cd596fe\" (UID: \"5de0359a-b8f8-4989-8739-ee565cd596fe\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.348305 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7" (OuterVolumeSpecName: "kube-api-access-ntnt7") pod "5de0359a-b8f8-4989-8739-ee565cd596fe" (UID: "5de0359a-b8f8-4989-8739-ee565cd596fe"). InnerVolumeSpecName "kube-api-access-ntnt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.372945 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5de0359a-b8f8-4989-8739-ee565cd596fe" (UID: "5de0359a-b8f8-4989-8739-ee565cd596fe"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.373796 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory" (OuterVolumeSpecName: "inventory") pod "5de0359a-b8f8-4989-8739-ee565cd596fe" (UID: "5de0359a-b8f8-4989-8739-ee565cd596fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.443647 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.443705 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntnt7\" (UniqueName: \"kubernetes.io/projected/5de0359a-b8f8-4989-8739-ee565cd596fe-kube-api-access-ntnt7\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.443718 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5de0359a-b8f8-4989-8739-ee565cd596fe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.620428 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.714258 5004 generic.go:334] "Generic (PLEG): container finished" podID="a21c70ce-4274-41b2-8856-1d97004089a4" containerID="56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3" exitCode=0 Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.714561 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerDied","Data":"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3"} Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.714591 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8wqg" event={"ID":"a21c70ce-4274-41b2-8856-1d97004089a4","Type":"ContainerDied","Data":"e6c91347737c10d8a1a00a4e0e792c5cd620467e43652cc92b60274b2033f477"} Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.714611 5004 scope.go:117] "RemoveContainer" containerID="56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.714746 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8wqg" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.718072 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29"} Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.724527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" event={"ID":"5de0359a-b8f8-4989-8739-ee565cd596fe","Type":"ContainerDied","Data":"b360f02ff83251af3ef2981e4f5e5a287f8251694b4492bac8f9f1ef98112a4c"} Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.724707 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b360f02ff83251af3ef2981e4f5e5a287f8251694b4492bac8f9f1ef98112a4c" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.724547 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9ggz8" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.724692 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brlb6" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="registry-server" containerID="cri-o://bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569" gracePeriod=2 Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.749299 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnqk\" (UniqueName: \"kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk\") pod \"a21c70ce-4274-41b2-8856-1d97004089a4\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.749393 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content\") pod \"a21c70ce-4274-41b2-8856-1d97004089a4\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.753968 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities" (OuterVolumeSpecName: "utilities") pod "a21c70ce-4274-41b2-8856-1d97004089a4" (UID: "a21c70ce-4274-41b2-8856-1d97004089a4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.754955 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities\") pod \"a21c70ce-4274-41b2-8856-1d97004089a4\" (UID: \"a21c70ce-4274-41b2-8856-1d97004089a4\") " Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.757361 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.764782 5004 scope.go:117] "RemoveContainer" containerID="9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.770458 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk" (OuterVolumeSpecName: "kube-api-access-kqnqk") pod "a21c70ce-4274-41b2-8856-1d97004089a4" (UID: "a21c70ce-4274-41b2-8856-1d97004089a4"). InnerVolumeSpecName "kube-api-access-kqnqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.826608 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng"] Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.827120 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="registry-server" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827145 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="registry-server" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.827173 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de0359a-b8f8-4989-8739-ee565cd596fe" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827184 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de0359a-b8f8-4989-8739-ee565cd596fe" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.827204 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="extract-content" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827212 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="extract-content" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.827246 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="extract-utilities" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827258 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="extract-utilities" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827550 5004 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" containerName="registry-server" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.827579 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de0359a-b8f8-4989-8739-ee565cd596fe" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.828402 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.831623 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.831676 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.831924 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.832707 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.855237 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng"] Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.860055 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnqk\" (UniqueName: \"kubernetes.io/projected/a21c70ce-4274-41b2-8856-1d97004089a4-kube-api-access-kqnqk\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.931284 5004 scope.go:117] "RemoveContainer" containerID="2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.938817 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21c70ce-4274-41b2-8856-1d97004089a4" (UID: "a21c70ce-4274-41b2-8856-1d97004089a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.962066 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.962120 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.962198 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkp4l\" (UniqueName: \"kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.962354 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21c70ce-4274-41b2-8856-1d97004089a4-catalog-content\") on node \"crc\" 
DevicePath \"\"" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.979730 5004 scope.go:117] "RemoveContainer" containerID="56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.980387 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3\": container with ID starting with 56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3 not found: ID does not exist" containerID="56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.980414 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3"} err="failed to get container status \"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3\": rpc error: code = NotFound desc = could not find container \"56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3\": container with ID starting with 56dd9f687c9781cac55c11413ad76bfe8b2120b30176f51432d877378e7d38b3 not found: ID does not exist" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.980433 5004 scope.go:117] "RemoveContainer" containerID="9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.980736 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c\": container with ID starting with 9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c not found: ID does not exist" containerID="9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.980754 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c"} err="failed to get container status \"9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c\": rpc error: code = NotFound desc = could not find container \"9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c\": container with ID starting with 9dff480ff00d77fada21330a44abe0b45124b2aa307916d30277eac0a5f8ac4c not found: ID does not exist" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.980786 5004 scope.go:117] "RemoveContainer" containerID="2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b" Dec 03 14:39:26 crc kubenswrapper[5004]: E1203 14:39:26.981061 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b\": container with ID starting with 2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b not found: ID does not exist" containerID="2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b" Dec 03 14:39:26 crc kubenswrapper[5004]: I1203 14:39:26.981087 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b"} err="failed to get container status \"2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b\": rpc error: code = NotFound desc = could not find container \"2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b\": container with ID starting with 2b1c33c996a0c9b7cc6ec5a62ee18ff3bd8572fa49300086bce1708bc4f0029b not found: ID does not exist" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.065376 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.065465 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.065578 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkp4l\" (UniqueName: \"kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.075655 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.075737 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.076820 5004 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.086960 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkp4l\" (UniqueName: \"kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqfng\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.090146 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8wqg"] Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.306667 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.327363 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.473681 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content\") pod \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.473906 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6589x\" (UniqueName: \"kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x\") pod \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.473980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities\") pod \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\" (UID: \"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b\") " Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.475373 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities" (OuterVolumeSpecName: "utilities") pod "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" (UID: "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.480025 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x" (OuterVolumeSpecName: "kube-api-access-6589x") pod "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" (UID: "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b"). InnerVolumeSpecName "kube-api-access-6589x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.538396 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" (UID: "4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.577035 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6589x\" (UniqueName: \"kubernetes.io/projected/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-kube-api-access-6589x\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.577091 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.577105 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.626117 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21c70ce-4274-41b2-8856-1d97004089a4" path="/var/lib/kubelet/pods/a21c70ce-4274-41b2-8856-1d97004089a4/volumes" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.738737 5004 generic.go:334] "Generic (PLEG): container finished" podID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerID="bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569" exitCode=0 Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.738793 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerDied","Data":"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569"} Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.738815 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-brlb6" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.738838 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brlb6" event={"ID":"4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b","Type":"ContainerDied","Data":"538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463"} Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.738883 5004 scope.go:117] "RemoveContainer" containerID="bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.764310 5004 scope.go:117] "RemoveContainer" containerID="1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.769255 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.777774 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brlb6"] Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.797106 5004 scope.go:117] "RemoveContainer" containerID="9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.833156 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng"] Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.849604 5004 scope.go:117] "RemoveContainer" containerID="bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569" Dec 03 14:39:27 crc kubenswrapper[5004]: E1203 14:39:27.850126 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569\": container with ID starting with 
bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569 not found: ID does not exist" containerID="bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.850171 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569"} err="failed to get container status \"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569\": rpc error: code = NotFound desc = could not find container \"bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569\": container with ID starting with bb3da7326a66bbcbd7f1cb063c1e28838a220f41f0789e5a74857058fa75c569 not found: ID does not exist" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.850206 5004 scope.go:117] "RemoveContainer" containerID="1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960" Dec 03 14:39:27 crc kubenswrapper[5004]: E1203 14:39:27.850558 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960\": container with ID starting with 1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960 not found: ID does not exist" containerID="1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.850582 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960"} err="failed to get container status \"1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960\": rpc error: code = NotFound desc = could not find container \"1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960\": container with ID starting with 1ee62773ceec7a607b80d27d4f268a9195bf7aea7b7979f7f3c8f95b7bd45960 not found: ID does not 
exist" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.850596 5004 scope.go:117] "RemoveContainer" containerID="9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966" Dec 03 14:39:27 crc kubenswrapper[5004]: E1203 14:39:27.850810 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966\": container with ID starting with 9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966 not found: ID does not exist" containerID="9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966" Dec 03 14:39:27 crc kubenswrapper[5004]: I1203 14:39:27.850831 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966"} err="failed to get container status \"9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966\": rpc error: code = NotFound desc = could not find container \"9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966\": container with ID starting with 9bba2ee9c5cd3ffc359f27977c5d1888d211f6d7ac2bdfd5ebe1c8bb6baff966 not found: ID does not exist" Dec 03 14:39:27 crc kubenswrapper[5004]: W1203 14:39:27.855393 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652a6191_a7f2_47a8_9f26_48137e58ce1b.slice/crio-16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f WatchSource:0}: Error finding container 16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f: Status 404 returned error can't find the container with id 16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f Dec 03 14:39:28 crc kubenswrapper[5004]: I1203 14:39:28.651775 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:39:28 crc kubenswrapper[5004]: 
I1203 14:39:28.752771 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" event={"ID":"652a6191-a7f2-47a8-9f26-48137e58ce1b","Type":"ContainerStarted","Data":"16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f"} Dec 03 14:39:29 crc kubenswrapper[5004]: I1203 14:39:29.625199 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" path="/var/lib/kubelet/pods/4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b/volumes" Dec 03 14:39:29 crc kubenswrapper[5004]: I1203 14:39:29.761819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" event={"ID":"652a6191-a7f2-47a8-9f26-48137e58ce1b","Type":"ContainerStarted","Data":"281f68fd35c0dda124c731593b6a0e9cb56bab572c5f1ac09b771a1e289ab894"} Dec 03 14:39:29 crc kubenswrapper[5004]: I1203 14:39:29.785883 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" podStartSLOduration=2.994814993 podStartE2EDuration="3.785844625s" podCreationTimestamp="2025-12-03 14:39:26 +0000 UTC" firstStartedPulling="2025-12-03 14:39:27.858046129 +0000 UTC m=+1980.607016365" lastFinishedPulling="2025-12-03 14:39:28.649075751 +0000 UTC m=+1981.398045997" observedRunningTime="2025-12-03 14:39:29.78046492 +0000 UTC m=+1982.529435166" watchObservedRunningTime="2025-12-03 14:39:29.785844625 +0000 UTC m=+1982.534814861" Dec 03 14:39:36 crc kubenswrapper[5004]: E1203 14:39:36.156735 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache]" Dec 03 14:39:37 crc kubenswrapper[5004]: I1203 14:39:37.041966 5004 scope.go:117] "RemoveContainer" containerID="390d9d79e25c674bdb7459b434605e925e68c625da875e49f3881fc6b76d4322" Dec 03 14:39:37 crc kubenswrapper[5004]: I1203 14:39:37.097101 5004 scope.go:117] "RemoveContainer" containerID="6e4f5838998300c39c89d703be54ae1e9136eab8425c56c67f5fca7d96729f25" Dec 03 14:39:37 crc kubenswrapper[5004]: I1203 14:39:37.155227 5004 scope.go:117] "RemoveContainer" containerID="3075704f48da98497ea8b79f58d0f79d33b25f83b5b44cf2d895d489b003c3ac" Dec 03 14:39:46 crc kubenswrapper[5004]: E1203 14:39:46.406970 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache]" Dec 03 14:39:56 crc kubenswrapper[5004]: E1203 14:39:56.662126 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache]" Dec 03 14:40:01 crc kubenswrapper[5004]: I1203 14:40:01.041984 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-nqm7s"] Dec 03 14:40:01 crc kubenswrapper[5004]: I1203 14:40:01.050381 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nqm7s"] Dec 03 14:40:01 crc kubenswrapper[5004]: I1203 14:40:01.624787 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cd7387-6950-4af4-9c08-cd702047c728" path="/var/lib/kubelet/pods/f3cd7387-6950-4af4-9c08-cd702047c728/volumes" Dec 03 14:40:06 crc kubenswrapper[5004]: E1203 14:40:06.923332 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache]" Dec 03 14:40:17 crc kubenswrapper[5004]: E1203 14:40:17.177266 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache]" Dec 03 14:40:18 crc kubenswrapper[5004]: I1203 14:40:18.217799 5004 generic.go:334] "Generic (PLEG): container finished" podID="652a6191-a7f2-47a8-9f26-48137e58ce1b" containerID="281f68fd35c0dda124c731593b6a0e9cb56bab572c5f1ac09b771a1e289ab894" exitCode=0 Dec 03 14:40:18 crc kubenswrapper[5004]: I1203 14:40:18.217922 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" event={"ID":"652a6191-a7f2-47a8-9f26-48137e58ce1b","Type":"ContainerDied","Data":"281f68fd35c0dda124c731593b6a0e9cb56bab572c5f1ac09b771a1e289ab894"} Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.742419 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.778061 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key\") pod \"652a6191-a7f2-47a8-9f26-48137e58ce1b\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.778196 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory\") pod \"652a6191-a7f2-47a8-9f26-48137e58ce1b\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.778317 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkp4l\" (UniqueName: \"kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l\") pod \"652a6191-a7f2-47a8-9f26-48137e58ce1b\" (UID: \"652a6191-a7f2-47a8-9f26-48137e58ce1b\") " Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.784504 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l" (OuterVolumeSpecName: "kube-api-access-zkp4l") pod "652a6191-a7f2-47a8-9f26-48137e58ce1b" (UID: "652a6191-a7f2-47a8-9f26-48137e58ce1b"). InnerVolumeSpecName "kube-api-access-zkp4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.806257 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "652a6191-a7f2-47a8-9f26-48137e58ce1b" (UID: "652a6191-a7f2-47a8-9f26-48137e58ce1b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.810380 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory" (OuterVolumeSpecName: "inventory") pod "652a6191-a7f2-47a8-9f26-48137e58ce1b" (UID: "652a6191-a7f2-47a8-9f26-48137e58ce1b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.880034 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.880075 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkp4l\" (UniqueName: \"kubernetes.io/projected/652a6191-a7f2-47a8-9f26-48137e58ce1b-kube-api-access-zkp4l\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:19 crc kubenswrapper[5004]: I1203 14:40:19.880088 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/652a6191-a7f2-47a8-9f26-48137e58ce1b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.235663 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" 
event={"ID":"652a6191-a7f2-47a8-9f26-48137e58ce1b","Type":"ContainerDied","Data":"16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f"} Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.235738 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16687707d64ee88192d2b8383631b305acc7a34d416a2184b039c286a862424f" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.235687 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqfng" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.335451 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qr7k"] Dec 03 14:40:20 crc kubenswrapper[5004]: E1203 14:40:20.335870 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="extract-utilities" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.335886 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="extract-utilities" Dec 03 14:40:20 crc kubenswrapper[5004]: E1203 14:40:20.335905 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652a6191-a7f2-47a8-9f26-48137e58ce1b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.335912 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="652a6191-a7f2-47a8-9f26-48137e58ce1b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:20 crc kubenswrapper[5004]: E1203 14:40:20.335920 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="extract-content" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.335926 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" 
containerName="extract-content" Dec 03 14:40:20 crc kubenswrapper[5004]: E1203 14:40:20.335937 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="registry-server" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.335943 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="registry-server" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.336155 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdfce07-8a30-4f5a-9ffa-5605d3b1b72b" containerName="registry-server" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.336169 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="652a6191-a7f2-47a8-9f26-48137e58ce1b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.336890 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.339345 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.339477 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.339917 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.340214 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.344344 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qr7k"] Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 
14:40:20.389033 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfp5s\" (UniqueName: \"kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.389388 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.389613 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.491750 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfp5s\" (UniqueName: \"kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.491964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.492047 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.496258 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.499251 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.508178 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfp5s\" (UniqueName: \"kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s\") pod \"ssh-known-hosts-edpm-deployment-4qr7k\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:20 crc kubenswrapper[5004]: I1203 14:40:20.658412 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:21 crc kubenswrapper[5004]: I1203 14:40:21.163825 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qr7k"] Dec 03 14:40:21 crc kubenswrapper[5004]: W1203 14:40:21.168091 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48017895_b32f_4afa_a7bf_e7e41c29d256.slice/crio-e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342 WatchSource:0}: Error finding container e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342: Status 404 returned error can't find the container with id e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342 Dec 03 14:40:21 crc kubenswrapper[5004]: I1203 14:40:21.244464 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" event={"ID":"48017895-b32f-4afa-a7bf-e7e41c29d256","Type":"ContainerStarted","Data":"e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342"} Dec 03 14:40:24 crc kubenswrapper[5004]: I1203 14:40:24.276904 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" event={"ID":"48017895-b32f-4afa-a7bf-e7e41c29d256","Type":"ContainerStarted","Data":"37d9598dc0c4435d2d9d54881473e738e2c18876d4a36bea24ab2b0ec6ac9f6c"} Dec 03 14:40:25 crc kubenswrapper[5004]: I1203 14:40:25.306356 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" podStartSLOduration=3.443889456 podStartE2EDuration="5.30633613s" podCreationTimestamp="2025-12-03 14:40:20 +0000 UTC" firstStartedPulling="2025-12-03 14:40:21.173817867 +0000 UTC m=+2033.922788103" lastFinishedPulling="2025-12-03 14:40:23.036264501 +0000 UTC m=+2035.785234777" observedRunningTime="2025-12-03 14:40:25.30217444 +0000 UTC m=+2038.051144686" 
watchObservedRunningTime="2025-12-03 14:40:25.30633613 +0000 UTC m=+2038.055306366" Dec 03 14:40:27 crc kubenswrapper[5004]: E1203 14:40:27.425073 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdfce07_8a30_4f5a_9ffa_5605d3b1b72b.slice/crio-538eacc62005aaf244762bb122c4be1553650f907838d7376bd2e1e9aeb27463\": RecentStats: unable to find data in memory cache]" Dec 03 14:40:27 crc kubenswrapper[5004]: E1203 14:40:27.649646 5004 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/c3342f25aee59ad23741af58b8eca3022266403b71ed5ea12ffd67600fb4039e/diff" to get inode usage: stat /var/lib/containers/storage/overlay/c3342f25aee59ad23741af58b8eca3022266403b71ed5ea12ffd67600fb4039e/diff: no such file or directory, extraDiskErr: Dec 03 14:40:32 crc kubenswrapper[5004]: I1203 14:40:32.361087 5004 generic.go:334] "Generic (PLEG): container finished" podID="48017895-b32f-4afa-a7bf-e7e41c29d256" containerID="37d9598dc0c4435d2d9d54881473e738e2c18876d4a36bea24ab2b0ec6ac9f6c" exitCode=0 Dec 03 14:40:32 crc kubenswrapper[5004]: I1203 14:40:32.361260 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" event={"ID":"48017895-b32f-4afa-a7bf-e7e41c29d256","Type":"ContainerDied","Data":"37d9598dc0c4435d2d9d54881473e738e2c18876d4a36bea24ab2b0ec6ac9f6c"} Dec 03 14:40:33 crc kubenswrapper[5004]: I1203 14:40:33.880374 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:33 crc kubenswrapper[5004]: I1203 14:40:33.983898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfp5s\" (UniqueName: \"kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s\") pod \"48017895-b32f-4afa-a7bf-e7e41c29d256\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " Dec 03 14:40:33 crc kubenswrapper[5004]: I1203 14:40:33.984180 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam\") pod \"48017895-b32f-4afa-a7bf-e7e41c29d256\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " Dec 03 14:40:33 crc kubenswrapper[5004]: I1203 14:40:33.984239 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0\") pod \"48017895-b32f-4afa-a7bf-e7e41c29d256\" (UID: \"48017895-b32f-4afa-a7bf-e7e41c29d256\") " Dec 03 14:40:33 crc kubenswrapper[5004]: I1203 14:40:33.991702 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s" (OuterVolumeSpecName: "kube-api-access-rfp5s") pod "48017895-b32f-4afa-a7bf-e7e41c29d256" (UID: "48017895-b32f-4afa-a7bf-e7e41c29d256"). InnerVolumeSpecName "kube-api-access-rfp5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.012403 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "48017895-b32f-4afa-a7bf-e7e41c29d256" (UID: "48017895-b32f-4afa-a7bf-e7e41c29d256"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.034991 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "48017895-b32f-4afa-a7bf-e7e41c29d256" (UID: "48017895-b32f-4afa-a7bf-e7e41c29d256"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.086230 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.086276 5004 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48017895-b32f-4afa-a7bf-e7e41c29d256-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.086304 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfp5s\" (UniqueName: \"kubernetes.io/projected/48017895-b32f-4afa-a7bf-e7e41c29d256-kube-api-access-rfp5s\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.391075 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" 
event={"ID":"48017895-b32f-4afa-a7bf-e7e41c29d256","Type":"ContainerDied","Data":"e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342"} Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.391121 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05badb318262fa66a12b978c4ba18825ff8d444549b72f95643ef8bd55d8342" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.391185 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qr7k" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.475590 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p"] Dec 03 14:40:34 crc kubenswrapper[5004]: E1203 14:40:34.476325 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48017895-b32f-4afa-a7bf-e7e41c29d256" containerName="ssh-known-hosts-edpm-deployment" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.476421 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="48017895-b32f-4afa-a7bf-e7e41c29d256" containerName="ssh-known-hosts-edpm-deployment" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.476727 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="48017895-b32f-4afa-a7bf-e7e41c29d256" containerName="ssh-known-hosts-edpm-deployment" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.477408 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.479527 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.479878 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.480098 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.480952 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.491831 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p"] Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.596015 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vds5l\" (UniqueName: \"kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.596143 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.596176 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.697572 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vds5l\" (UniqueName: \"kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.697676 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.697703 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.701930 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.711510 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.718429 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vds5l\" (UniqueName: \"kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6j48p\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:34 crc kubenswrapper[5004]: I1203 14:40:34.798198 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:35 crc kubenswrapper[5004]: I1203 14:40:35.349341 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p"] Dec 03 14:40:35 crc kubenswrapper[5004]: I1203 14:40:35.437020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" event={"ID":"35e88acc-36ab-41a3-ab34-a04a3a4234de","Type":"ContainerStarted","Data":"6309e3d61175f0fe75b51c99dda57674bb5927bda4b22c694a7a5d950a8db448"} Dec 03 14:40:36 crc kubenswrapper[5004]: I1203 14:40:36.447765 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" event={"ID":"35e88acc-36ab-41a3-ab34-a04a3a4234de","Type":"ContainerStarted","Data":"89ece7b66c10e377b082cb1c08070571da8b2b0f6098c52b7aef633ec48cac1d"} Dec 03 14:40:36 crc kubenswrapper[5004]: I1203 14:40:36.467544 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" podStartSLOduration=1.948016252 podStartE2EDuration="2.467527906s" podCreationTimestamp="2025-12-03 14:40:34 +0000 UTC" firstStartedPulling="2025-12-03 14:40:35.358196623 +0000 UTC m=+2048.107166859" lastFinishedPulling="2025-12-03 14:40:35.877708257 +0000 UTC m=+2048.626678513" observedRunningTime="2025-12-03 14:40:36.459901057 +0000 UTC m=+2049.208871293" watchObservedRunningTime="2025-12-03 14:40:36.467527906 +0000 UTC m=+2049.216498142" Dec 03 14:40:37 crc kubenswrapper[5004]: I1203 14:40:37.344574 5004 scope.go:117] "RemoveContainer" containerID="3f7be2230eb25301553197e183633bb43751b39758ee910b6e4d093b90ed0a3b" Dec 03 14:40:46 crc kubenswrapper[5004]: I1203 14:40:46.539569 5004 generic.go:334] "Generic (PLEG): container finished" podID="35e88acc-36ab-41a3-ab34-a04a3a4234de" 
containerID="89ece7b66c10e377b082cb1c08070571da8b2b0f6098c52b7aef633ec48cac1d" exitCode=0 Dec 03 14:40:46 crc kubenswrapper[5004]: I1203 14:40:46.539625 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" event={"ID":"35e88acc-36ab-41a3-ab34-a04a3a4234de","Type":"ContainerDied","Data":"89ece7b66c10e377b082cb1c08070571da8b2b0f6098c52b7aef633ec48cac1d"} Dec 03 14:40:47 crc kubenswrapper[5004]: I1203 14:40:47.961837 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.080750 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory\") pod \"35e88acc-36ab-41a3-ab34-a04a3a4234de\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.080951 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key\") pod \"35e88acc-36ab-41a3-ab34-a04a3a4234de\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.081121 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vds5l\" (UniqueName: \"kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l\") pod \"35e88acc-36ab-41a3-ab34-a04a3a4234de\" (UID: \"35e88acc-36ab-41a3-ab34-a04a3a4234de\") " Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.086394 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l" (OuterVolumeSpecName: "kube-api-access-vds5l") pod "35e88acc-36ab-41a3-ab34-a04a3a4234de" (UID: 
"35e88acc-36ab-41a3-ab34-a04a3a4234de"). InnerVolumeSpecName "kube-api-access-vds5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.112085 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35e88acc-36ab-41a3-ab34-a04a3a4234de" (UID: "35e88acc-36ab-41a3-ab34-a04a3a4234de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.113288 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory" (OuterVolumeSpecName: "inventory") pod "35e88acc-36ab-41a3-ab34-a04a3a4234de" (UID: "35e88acc-36ab-41a3-ab34-a04a3a4234de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.183927 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.183965 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vds5l\" (UniqueName: \"kubernetes.io/projected/35e88acc-36ab-41a3-ab34-a04a3a4234de-kube-api-access-vds5l\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.183981 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e88acc-36ab-41a3-ab34-a04a3a4234de-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.559505 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" 
event={"ID":"35e88acc-36ab-41a3-ab34-a04a3a4234de","Type":"ContainerDied","Data":"6309e3d61175f0fe75b51c99dda57674bb5927bda4b22c694a7a5d950a8db448"} Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.559564 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6309e3d61175f0fe75b51c99dda57674bb5927bda4b22c694a7a5d950a8db448" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.559530 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6j48p" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.636463 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68"] Dec 03 14:40:48 crc kubenswrapper[5004]: E1203 14:40:48.636940 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e88acc-36ab-41a3-ab34-a04a3a4234de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.636956 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e88acc-36ab-41a3-ab34-a04a3a4234de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.637118 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e88acc-36ab-41a3-ab34-a04a3a4234de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.637827 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.640330 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.640443 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.641586 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.641628 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.665335 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68"] Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.794190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8h55\" (UniqueName: \"kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.794421 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.794667 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.897178 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.897247 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.897362 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8h55\" (UniqueName: \"kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.901318 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.903362 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.916041 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8h55\" (UniqueName: \"kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:48 crc kubenswrapper[5004]: I1203 14:40:48.953598 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:40:49 crc kubenswrapper[5004]: I1203 14:40:49.511389 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68"] Dec 03 14:40:49 crc kubenswrapper[5004]: I1203 14:40:49.518084 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:40:49 crc kubenswrapper[5004]: I1203 14:40:49.569709 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" event={"ID":"0c169631-8cd9-45a0-b295-026cd99d6e41","Type":"ContainerStarted","Data":"aa592d3f7e74381b4815b5ad7abe90b159fd6c574895bc91f92f0c9ce0e1dca3"} Dec 03 14:40:50 crc kubenswrapper[5004]: I1203 14:40:50.577667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" event={"ID":"0c169631-8cd9-45a0-b295-026cd99d6e41","Type":"ContainerStarted","Data":"7295a5034bbba5fee14f52b31b552041908453d8ffd75eccc9d89fa600e817c1"} Dec 03 14:40:50 crc kubenswrapper[5004]: I1203 14:40:50.595423 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" podStartSLOduration=1.8817334300000002 podStartE2EDuration="2.595407784s" podCreationTimestamp="2025-12-03 14:40:48 +0000 UTC" firstStartedPulling="2025-12-03 14:40:49.517879276 +0000 UTC m=+2062.266849512" lastFinishedPulling="2025-12-03 14:40:50.23155361 +0000 UTC m=+2062.980523866" observedRunningTime="2025-12-03 14:40:50.593841289 +0000 UTC m=+2063.342811525" watchObservedRunningTime="2025-12-03 14:40:50.595407784 +0000 UTC m=+2063.344378020" Dec 03 14:41:00 crc kubenswrapper[5004]: I1203 14:41:00.659261 5004 generic.go:334] "Generic (PLEG): container finished" podID="0c169631-8cd9-45a0-b295-026cd99d6e41" 
containerID="7295a5034bbba5fee14f52b31b552041908453d8ffd75eccc9d89fa600e817c1" exitCode=0 Dec 03 14:41:00 crc kubenswrapper[5004]: I1203 14:41:00.659352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" event={"ID":"0c169631-8cd9-45a0-b295-026cd99d6e41","Type":"ContainerDied","Data":"7295a5034bbba5fee14f52b31b552041908453d8ffd75eccc9d89fa600e817c1"} Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.084536 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.258017 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8h55\" (UniqueName: \"kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55\") pod \"0c169631-8cd9-45a0-b295-026cd99d6e41\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.258185 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key\") pod \"0c169631-8cd9-45a0-b295-026cd99d6e41\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.258285 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory\") pod \"0c169631-8cd9-45a0-b295-026cd99d6e41\" (UID: \"0c169631-8cd9-45a0-b295-026cd99d6e41\") " Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.264091 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55" (OuterVolumeSpecName: "kube-api-access-w8h55") pod "0c169631-8cd9-45a0-b295-026cd99d6e41" 
(UID: "0c169631-8cd9-45a0-b295-026cd99d6e41"). InnerVolumeSpecName "kube-api-access-w8h55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.287414 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory" (OuterVolumeSpecName: "inventory") pod "0c169631-8cd9-45a0-b295-026cd99d6e41" (UID: "0c169631-8cd9-45a0-b295-026cd99d6e41"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.287948 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c169631-8cd9-45a0-b295-026cd99d6e41" (UID: "0c169631-8cd9-45a0-b295-026cd99d6e41"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.361097 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.361140 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8h55\" (UniqueName: \"kubernetes.io/projected/0c169631-8cd9-45a0-b295-026cd99d6e41-kube-api-access-w8h55\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.361157 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c169631-8cd9-45a0-b295-026cd99d6e41-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.680775 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" 
event={"ID":"0c169631-8cd9-45a0-b295-026cd99d6e41","Type":"ContainerDied","Data":"aa592d3f7e74381b4815b5ad7abe90b159fd6c574895bc91f92f0c9ce0e1dca3"} Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.681104 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa592d3f7e74381b4815b5ad7abe90b159fd6c574895bc91f92f0c9ce0e1dca3" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.681164 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.773575 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m"] Dec 03 14:41:02 crc kubenswrapper[5004]: E1203 14:41:02.773965 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c169631-8cd9-45a0-b295-026cd99d6e41" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.773983 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c169631-8cd9-45a0-b295-026cd99d6e41" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.774168 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c169631-8cd9-45a0-b295-026cd99d6e41" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.774743 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.778115 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.778280 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.778834 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.782300 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.782482 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.782585 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.782702 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.782922 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.792885 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m"] Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.971974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972014 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972047 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972711 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972823 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972876 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972900 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972916 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972954 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.972990 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.973058 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.973131 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69kr\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.973182 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:02 crc kubenswrapper[5004]: I1203 14:41:02.973254 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69kr\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074801 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074847 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074925 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.074985 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075016 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075062 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075096 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075126 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075213 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075240 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.075277 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.080674 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.080978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.081635 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.082257 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.082380 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.082417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.083594 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.083828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.084037 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.085432 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.085749 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.087065 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.093471 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.095332 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69kr\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx45m\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.098360 
5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.674122 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m"] Dec 03 14:41:03 crc kubenswrapper[5004]: I1203 14:41:03.689539 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" event={"ID":"4f32dcbb-677a-48e6-9c25-eaec1655a155","Type":"ContainerStarted","Data":"5acf12b0c8f48739e43f8a1fd279860384f6f745ce83c44a0651c326cfe67048"} Dec 03 14:41:04 crc kubenswrapper[5004]: I1203 14:41:04.703740 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" event={"ID":"4f32dcbb-677a-48e6-9c25-eaec1655a155","Type":"ContainerStarted","Data":"e0c5b647640ed982bcaac6b77c64f22a7e09c45054d54822143669ee71e128bf"} Dec 03 14:41:04 crc kubenswrapper[5004]: I1203 14:41:04.721919 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" podStartSLOduration=2.113744397 podStartE2EDuration="2.721905743s" podCreationTimestamp="2025-12-03 14:41:02 +0000 UTC" firstStartedPulling="2025-12-03 14:41:03.67858601 +0000 UTC m=+2076.427556246" lastFinishedPulling="2025-12-03 14:41:04.286747336 +0000 UTC m=+2077.035717592" observedRunningTime="2025-12-03 14:41:04.720465762 +0000 UTC m=+2077.469435988" watchObservedRunningTime="2025-12-03 14:41:04.721905743 +0000 UTC m=+2077.470875969" Dec 03 14:41:45 crc kubenswrapper[5004]: I1203 14:41:45.161208 5004 generic.go:334] "Generic (PLEG): container finished" podID="4f32dcbb-677a-48e6-9c25-eaec1655a155" containerID="e0c5b647640ed982bcaac6b77c64f22a7e09c45054d54822143669ee71e128bf" exitCode=0 Dec 03 14:41:45 crc kubenswrapper[5004]: I1203 14:41:45.161327 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" event={"ID":"4f32dcbb-677a-48e6-9c25-eaec1655a155","Type":"ContainerDied","Data":"e0c5b647640ed982bcaac6b77c64f22a7e09c45054d54822143669ee71e128bf"} Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.596564 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741379 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741432 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741509 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741531 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc 
kubenswrapper[5004]: I1203 14:41:46.741585 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741690 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741721 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741765 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc 
kubenswrapper[5004]: I1203 14:41:46.741798 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741833 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z69kr\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741851 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.741913 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.742009 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"4f32dcbb-677a-48e6-9c25-eaec1655a155\" (UID: \"4f32dcbb-677a-48e6-9c25-eaec1655a155\") " Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.748601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.749237 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.749288 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.749304 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.749991 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.750036 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.750467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.751680 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr" (OuterVolumeSpecName: "kube-api-access-z69kr") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "kube-api-access-z69kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.751833 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.752894 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.756654 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.760010 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.775215 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory" (OuterVolumeSpecName: "inventory") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.780819 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f32dcbb-677a-48e6-9c25-eaec1655a155" (UID: "4f32dcbb-677a-48e6-9c25-eaec1655a155"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.843974 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844031 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844043 5004 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844052 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844063 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z69kr\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-kube-api-access-z69kr\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844073 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844080 5004 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844090 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844100 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844110 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844121 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844133 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844145 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4f32dcbb-677a-48e6-9c25-eaec1655a155-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:46 crc kubenswrapper[5004]: I1203 14:41:46.844157 5004 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f32dcbb-677a-48e6-9c25-eaec1655a155-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.180487 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" event={"ID":"4f32dcbb-677a-48e6-9c25-eaec1655a155","Type":"ContainerDied","Data":"5acf12b0c8f48739e43f8a1fd279860384f6f745ce83c44a0651c326cfe67048"} Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.180784 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5acf12b0c8f48739e43f8a1fd279860384f6f745ce83c44a0651c326cfe67048" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.180576 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx45m" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.356472 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82"] Dec 03 14:41:47 crc kubenswrapper[5004]: E1203 14:41:47.362117 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f32dcbb-677a-48e6-9c25-eaec1655a155" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.362155 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f32dcbb-677a-48e6-9c25-eaec1655a155" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.362443 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f32dcbb-677a-48e6-9c25-eaec1655a155" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.363198 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.365769 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.366186 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.366388 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.368525 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82"] Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.369381 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.369413 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.455245 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.455336 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.455374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.455521 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.455567 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgnh\" (UniqueName: \"kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.556880 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.556963 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.557009 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.557123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.557169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjgnh\" (UniqueName: \"kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.558162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc 
kubenswrapper[5004]: I1203 14:41:47.562519 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.562603 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.567580 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.613907 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjgnh\" (UniqueName: \"kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zkt82\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:47 crc kubenswrapper[5004]: I1203 14:41:47.686707 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:41:48 crc kubenswrapper[5004]: I1203 14:41:48.236662 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82"] Dec 03 14:41:49 crc kubenswrapper[5004]: I1203 14:41:49.201461 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" event={"ID":"122d652b-2c6a-4aa2-9303-e844922d4620","Type":"ContainerStarted","Data":"0b9937bf7a51ac67f312d2383110bfab46a5ec03e6d1e7841de715d5c2187294"} Dec 03 14:41:50 crc kubenswrapper[5004]: I1203 14:41:50.212922 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" event={"ID":"122d652b-2c6a-4aa2-9303-e844922d4620","Type":"ContainerStarted","Data":"ce3cbcbedbd3405a343d491fe3dcd9fa56dbb5dc787daec252767d5e7d9aacda"} Dec 03 14:41:52 crc kubenswrapper[5004]: I1203 14:41:52.824914 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:41:52 crc kubenswrapper[5004]: I1203 14:41:52.825945 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:42:22 crc kubenswrapper[5004]: I1203 14:42:22.824527 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 14:42:22 crc kubenswrapper[5004]: I1203 14:42:22.825092 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.052131 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" podStartSLOduration=40.292593369 podStartE2EDuration="41.052110683s" podCreationTimestamp="2025-12-03 14:41:47 +0000 UTC" firstStartedPulling="2025-12-03 14:41:48.233031435 +0000 UTC m=+2120.982001671" lastFinishedPulling="2025-12-03 14:41:48.992548749 +0000 UTC m=+2121.741518985" observedRunningTime="2025-12-03 14:41:50.232432182 +0000 UTC m=+2122.981402438" watchObservedRunningTime="2025-12-03 14:42:28.052110683 +0000 UTC m=+2160.801080909" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.053411 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.055207 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.069378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.069670 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.071197 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwc8\" (UniqueName: \"kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.072490 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.173446 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwc8\" (UniqueName: \"kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.173566 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.173590 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.174314 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.174386 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.202624 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fwc8\" (UniqueName: \"kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8\") pod \"certified-operators-jx6vr\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.375765 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:28 crc kubenswrapper[5004]: I1203 14:42:28.903771 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:29 crc kubenswrapper[5004]: I1203 14:42:29.593887 5004 generic.go:334] "Generic (PLEG): container finished" podID="5baf8fca-5164-428b-b030-598b020c0587" containerID="085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9" exitCode=0 Dec 03 14:42:29 crc kubenswrapper[5004]: I1203 14:42:29.593973 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerDied","Data":"085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9"} Dec 03 14:42:29 crc kubenswrapper[5004]: I1203 14:42:29.594192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerStarted","Data":"1b33df87c48d5df9af36cfb31a93a706e928a09d3b82040af8f3f8d528532c31"} Dec 03 14:42:32 crc kubenswrapper[5004]: I1203 14:42:32.629673 5004 generic.go:334] "Generic (PLEG): container finished" podID="5baf8fca-5164-428b-b030-598b020c0587" containerID="74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888" exitCode=0 Dec 03 14:42:32 crc kubenswrapper[5004]: I1203 14:42:32.629766 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerDied","Data":"74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888"} Dec 03 14:42:34 crc kubenswrapper[5004]: I1203 14:42:34.652229 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" 
event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerStarted","Data":"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae"} Dec 03 14:42:34 crc kubenswrapper[5004]: I1203 14:42:34.676717 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jx6vr" podStartSLOduration=2.835351974 podStartE2EDuration="6.676692475s" podCreationTimestamp="2025-12-03 14:42:28 +0000 UTC" firstStartedPulling="2025-12-03 14:42:29.595778691 +0000 UTC m=+2162.344748927" lastFinishedPulling="2025-12-03 14:42:33.437119192 +0000 UTC m=+2166.186089428" observedRunningTime="2025-12-03 14:42:34.67029379 +0000 UTC m=+2167.419264026" watchObservedRunningTime="2025-12-03 14:42:34.676692475 +0000 UTC m=+2167.425662711" Dec 03 14:42:38 crc kubenswrapper[5004]: I1203 14:42:38.376287 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:38 crc kubenswrapper[5004]: I1203 14:42:38.377532 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:38 crc kubenswrapper[5004]: I1203 14:42:38.438066 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:38 crc kubenswrapper[5004]: I1203 14:42:38.753272 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:38 crc kubenswrapper[5004]: I1203 14:42:38.798329 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:40 crc kubenswrapper[5004]: I1203 14:42:40.711623 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jx6vr" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="registry-server" 
containerID="cri-o://9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae" gracePeriod=2 Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.717740 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.720290 5004 generic.go:334] "Generic (PLEG): container finished" podID="5baf8fca-5164-428b-b030-598b020c0587" containerID="9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae" exitCode=0 Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.720342 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerDied","Data":"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae"} Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.720374 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jx6vr" event={"ID":"5baf8fca-5164-428b-b030-598b020c0587","Type":"ContainerDied","Data":"1b33df87c48d5df9af36cfb31a93a706e928a09d3b82040af8f3f8d528532c31"} Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.720394 5004 scope.go:117] "RemoveContainer" containerID="9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.746681 5004 scope.go:117] "RemoveContainer" containerID="74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.777044 5004 scope.go:117] "RemoveContainer" containerID="085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.823832 5004 scope.go:117] "RemoveContainer" containerID="9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae" Dec 03 14:42:41 crc kubenswrapper[5004]: E1203 14:42:41.824995 5004 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae\": container with ID starting with 9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae not found: ID does not exist" containerID="9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.825039 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae"} err="failed to get container status \"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae\": rpc error: code = NotFound desc = could not find container \"9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae\": container with ID starting with 9702b8c78d596ec598fc308178a30410e09d0c1a79c37fe0aea94700eadf8fae not found: ID does not exist" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.825090 5004 scope.go:117] "RemoveContainer" containerID="74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888" Dec 03 14:42:41 crc kubenswrapper[5004]: E1203 14:42:41.825434 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888\": container with ID starting with 74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888 not found: ID does not exist" containerID="74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.825478 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888"} err="failed to get container status \"74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888\": rpc error: code = NotFound desc = could 
not find container \"74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888\": container with ID starting with 74d99d0a30734b8d5a49240679f18ab14a42db80effb28e8ebf05829c4c24888 not found: ID does not exist" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.825505 5004 scope.go:117] "RemoveContainer" containerID="085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9" Dec 03 14:42:41 crc kubenswrapper[5004]: E1203 14:42:41.825807 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9\": container with ID starting with 085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9 not found: ID does not exist" containerID="085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.825844 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9"} err="failed to get container status \"085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9\": rpc error: code = NotFound desc = could not find container \"085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9\": container with ID starting with 085f2bbc4eb780fc31fa5791907a4a1cbaf6bc3fe1b5f52adb976032f4b2c4d9 not found: ID does not exist" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.860793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities\") pod \"5baf8fca-5164-428b-b030-598b020c0587\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.860896 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content\") pod \"5baf8fca-5164-428b-b030-598b020c0587\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.860939 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fwc8\" (UniqueName: \"kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8\") pod \"5baf8fca-5164-428b-b030-598b020c0587\" (UID: \"5baf8fca-5164-428b-b030-598b020c0587\") " Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.861999 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities" (OuterVolumeSpecName: "utilities") pod "5baf8fca-5164-428b-b030-598b020c0587" (UID: "5baf8fca-5164-428b-b030-598b020c0587"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.870083 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8" (OuterVolumeSpecName: "kube-api-access-2fwc8") pod "5baf8fca-5164-428b-b030-598b020c0587" (UID: "5baf8fca-5164-428b-b030-598b020c0587"). InnerVolumeSpecName "kube-api-access-2fwc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.915084 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5baf8fca-5164-428b-b030-598b020c0587" (UID: "5baf8fca-5164-428b-b030-598b020c0587"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.963477 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.963519 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5baf8fca-5164-428b-b030-598b020c0587-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:41 crc kubenswrapper[5004]: I1203 14:42:41.963533 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fwc8\" (UniqueName: \"kubernetes.io/projected/5baf8fca-5164-428b-b030-598b020c0587-kube-api-access-2fwc8\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.728062 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jx6vr" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.762239 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.770326 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jx6vr"] Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.887284 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:42:42 crc kubenswrapper[5004]: E1203 14:42:42.887935 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="extract-content" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.887951 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="extract-content" Dec 03 14:42:42 crc 
kubenswrapper[5004]: E1203 14:42:42.887969 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="extract-utilities" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.887977 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="extract-utilities" Dec 03 14:42:42 crc kubenswrapper[5004]: E1203 14:42:42.887987 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="registry-server" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.887993 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="registry-server" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.888211 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5baf8fca-5164-428b-b030-598b020c0587" containerName="registry-server" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.889788 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.900944 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.977612 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.977894 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvqg\" (UniqueName: \"kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:42 crc kubenswrapper[5004]: I1203 14:42:42.978021 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.080482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.080564 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kbvqg\" (UniqueName: \"kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.080657 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.081068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.081187 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.098137 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvqg\" (UniqueName: \"kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg\") pod \"redhat-marketplace-6b6sr\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.211728 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.623910 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5baf8fca-5164-428b-b030-598b020c0587" path="/var/lib/kubelet/pods/5baf8fca-5164-428b-b030-598b020c0587/volumes" Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.698725 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:42:43 crc kubenswrapper[5004]: I1203 14:42:43.741072 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerStarted","Data":"11794cc1476c82b73df1c13472e246be90c488acca3d8e279836cd00e9e2415e"} Dec 03 14:42:44 crc kubenswrapper[5004]: I1203 14:42:44.750883 5004 generic.go:334] "Generic (PLEG): container finished" podID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerID="2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db" exitCode=0 Dec 03 14:42:44 crc kubenswrapper[5004]: I1203 14:42:44.750933 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerDied","Data":"2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db"} Dec 03 14:42:47 crc kubenswrapper[5004]: I1203 14:42:47.780596 5004 generic.go:334] "Generic (PLEG): container finished" podID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerID="cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9" exitCode=0 Dec 03 14:42:47 crc kubenswrapper[5004]: I1203 14:42:47.780662 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerDied","Data":"cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9"} Dec 03 14:42:49 crc 
kubenswrapper[5004]: I1203 14:42:49.800044 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerStarted","Data":"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80"} Dec 03 14:42:49 crc kubenswrapper[5004]: I1203 14:42:49.802345 5004 generic.go:334] "Generic (PLEG): container finished" podID="122d652b-2c6a-4aa2-9303-e844922d4620" containerID="ce3cbcbedbd3405a343d491fe3dcd9fa56dbb5dc787daec252767d5e7d9aacda" exitCode=0 Dec 03 14:42:49 crc kubenswrapper[5004]: I1203 14:42:49.802390 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" event={"ID":"122d652b-2c6a-4aa2-9303-e844922d4620","Type":"ContainerDied","Data":"ce3cbcbedbd3405a343d491fe3dcd9fa56dbb5dc787daec252767d5e7d9aacda"} Dec 03 14:42:49 crc kubenswrapper[5004]: I1203 14:42:49.826846 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6b6sr" podStartSLOduration=3.225725418 podStartE2EDuration="7.82682325s" podCreationTimestamp="2025-12-03 14:42:42 +0000 UTC" firstStartedPulling="2025-12-03 14:42:44.752553057 +0000 UTC m=+2177.501523293" lastFinishedPulling="2025-12-03 14:42:49.353650889 +0000 UTC m=+2182.102621125" observedRunningTime="2025-12-03 14:42:49.823309309 +0000 UTC m=+2182.572279545" watchObservedRunningTime="2025-12-03 14:42:49.82682325 +0000 UTC m=+2182.575793486" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.258420 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.261253 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0\") pod \"122d652b-2c6a-4aa2-9303-e844922d4620\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.261328 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjgnh\" (UniqueName: \"kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh\") pod \"122d652b-2c6a-4aa2-9303-e844922d4620\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.261359 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key\") pod \"122d652b-2c6a-4aa2-9303-e844922d4620\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.261388 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory\") pod \"122d652b-2c6a-4aa2-9303-e844922d4620\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.261442 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle\") pod \"122d652b-2c6a-4aa2-9303-e844922d4620\" (UID: \"122d652b-2c6a-4aa2-9303-e844922d4620\") " Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.267256 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "122d652b-2c6a-4aa2-9303-e844922d4620" (UID: "122d652b-2c6a-4aa2-9303-e844922d4620"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.271313 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh" (OuterVolumeSpecName: "kube-api-access-rjgnh") pod "122d652b-2c6a-4aa2-9303-e844922d4620" (UID: "122d652b-2c6a-4aa2-9303-e844922d4620"). InnerVolumeSpecName "kube-api-access-rjgnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.296683 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "122d652b-2c6a-4aa2-9303-e844922d4620" (UID: "122d652b-2c6a-4aa2-9303-e844922d4620"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.306145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory" (OuterVolumeSpecName: "inventory") pod "122d652b-2c6a-4aa2-9303-e844922d4620" (UID: "122d652b-2c6a-4aa2-9303-e844922d4620"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.320072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "122d652b-2c6a-4aa2-9303-e844922d4620" (UID: "122d652b-2c6a-4aa2-9303-e844922d4620"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.365161 5004 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/122d652b-2c6a-4aa2-9303-e844922d4620-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.365211 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjgnh\" (UniqueName: \"kubernetes.io/projected/122d652b-2c6a-4aa2-9303-e844922d4620-kube-api-access-rjgnh\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.365226 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.365239 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.365251 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d652b-2c6a-4aa2-9303-e844922d4620-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.819064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" event={"ID":"122d652b-2c6a-4aa2-9303-e844922d4620","Type":"ContainerDied","Data":"0b9937bf7a51ac67f312d2383110bfab46a5ec03e6d1e7841de715d5c2187294"} Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.819378 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b9937bf7a51ac67f312d2383110bfab46a5ec03e6d1e7841de715d5c2187294" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.819193 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zkt82" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.936983 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c"] Dec 03 14:42:51 crc kubenswrapper[5004]: E1203 14:42:51.937395 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122d652b-2c6a-4aa2-9303-e844922d4620" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.937417 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="122d652b-2c6a-4aa2-9303-e844922d4620" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.938104 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="122d652b-2c6a-4aa2-9303-e844922d4620" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.938911 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.941434 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.942143 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.942347 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.942530 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.942748 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.942781 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:42:51 crc kubenswrapper[5004]: I1203 14:42:51.952329 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c"] Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.080560 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.080610 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.080636 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtb6\" (UniqueName: \"kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.080677 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.081539 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.081635 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.183803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.183899 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.184006 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.184042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.184070 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtb6\" (UniqueName: \"kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.184107 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.189407 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.189602 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: 
\"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.190379 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.190449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.193692 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.201294 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtb6\" (UniqueName: \"kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 
14:42:52.267809 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.816101 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c"] Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.824844 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.824968 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.825025 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.825881 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.825932 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29" gracePeriod=600 Dec 03 14:42:52 crc kubenswrapper[5004]: I1203 14:42:52.831549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" event={"ID":"752f8ea2-1e21-4ff4-aac9-4f1a5f662561","Type":"ContainerStarted","Data":"1a663fbdb867908ec590d28b3f011935f3440297f93d1a7232c1fec0bd02e4d3"} Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.212133 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.212387 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.277536 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.866469 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29" exitCode=0 Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.866568 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29"} Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.867261 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8"} Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.867294 5004 scope.go:117] "RemoveContainer" containerID="c1bbae7a0ffbd2d37576ccfb95148cb9eaf76e3e8fc411226627365c0f78ffbb" Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.873725 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" event={"ID":"752f8ea2-1e21-4ff4-aac9-4f1a5f662561","Type":"ContainerStarted","Data":"b7fb0dc95e94a40bae1da27d6c69bd860c717b0a84a9e651f5584b43197a0e08"} Dec 03 14:42:53 crc kubenswrapper[5004]: I1203 14:42:53.907948 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" podStartSLOduration=2.522691123 podStartE2EDuration="2.907927983s" podCreationTimestamp="2025-12-03 14:42:51 +0000 UTC" firstStartedPulling="2025-12-03 14:42:52.823725832 +0000 UTC m=+2185.572696068" lastFinishedPulling="2025-12-03 14:42:53.208962692 +0000 UTC m=+2185.957932928" observedRunningTime="2025-12-03 14:42:53.901025644 +0000 UTC m=+2186.649995880" watchObservedRunningTime="2025-12-03 14:42:53.907927983 +0000 UTC m=+2186.656898219" Dec 03 14:43:03 crc kubenswrapper[5004]: I1203 14:43:03.260339 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:43:03 crc kubenswrapper[5004]: I1203 14:43:03.315251 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:43:03 crc kubenswrapper[5004]: I1203 14:43:03.973647 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6b6sr" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="registry-server" 
containerID="cri-o://bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80" gracePeriod=2 Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.966407 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.985624 5004 generic.go:334] "Generic (PLEG): container finished" podID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerID="bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80" exitCode=0 Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.985666 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerDied","Data":"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80"} Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.985693 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6b6sr" event={"ID":"416a664b-73d0-4a4f-88d0-ac86e50321a2","Type":"ContainerDied","Data":"11794cc1476c82b73df1c13472e246be90c488acca3d8e279836cd00e9e2415e"} Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.985716 5004 scope.go:117] "RemoveContainer" containerID="bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80" Dec 03 14:43:04 crc kubenswrapper[5004]: I1203 14:43:04.985757 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6b6sr" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.016053 5004 scope.go:117] "RemoveContainer" containerID="cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.036824 5004 scope.go:117] "RemoveContainer" containerID="2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.088395 5004 scope.go:117] "RemoveContainer" containerID="bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80" Dec 03 14:43:05 crc kubenswrapper[5004]: E1203 14:43:05.089015 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80\": container with ID starting with bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80 not found: ID does not exist" containerID="bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.089051 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80"} err="failed to get container status \"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80\": rpc error: code = NotFound desc = could not find container \"bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80\": container with ID starting with bb9cebba4735c265c2613dacd7ef7ec1bd91358ca2954e5378c66be8f7439f80 not found: ID does not exist" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.089077 5004 scope.go:117] "RemoveContainer" containerID="cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9" Dec 03 14:43:05 crc kubenswrapper[5004]: E1203 14:43:05.089321 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9\": container with ID starting with cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9 not found: ID does not exist" containerID="cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.089348 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9"} err="failed to get container status \"cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9\": rpc error: code = NotFound desc = could not find container \"cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9\": container with ID starting with cb9f2053ab043999e0739aa5e6a7a03988d9519dd9bfc5888ef0ba6cd59f7fa9 not found: ID does not exist" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.089362 5004 scope.go:117] "RemoveContainer" containerID="2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db" Dec 03 14:43:05 crc kubenswrapper[5004]: E1203 14:43:05.089720 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db\": container with ID starting with 2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db not found: ID does not exist" containerID="2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.089747 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db"} err="failed to get container status \"2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db\": rpc error: code = NotFound desc = could not find container 
\"2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db\": container with ID starting with 2103864c2872ae71a1c1c662996824ca8b3bca3fcd9f0f76c8f5f1a8758a61db not found: ID does not exist" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.128568 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbvqg\" (UniqueName: \"kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg\") pod \"416a664b-73d0-4a4f-88d0-ac86e50321a2\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.128625 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content\") pod \"416a664b-73d0-4a4f-88d0-ac86e50321a2\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.128920 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities\") pod \"416a664b-73d0-4a4f-88d0-ac86e50321a2\" (UID: \"416a664b-73d0-4a4f-88d0-ac86e50321a2\") " Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.131016 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities" (OuterVolumeSpecName: "utilities") pod "416a664b-73d0-4a4f-88d0-ac86e50321a2" (UID: "416a664b-73d0-4a4f-88d0-ac86e50321a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.136282 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg" (OuterVolumeSpecName: "kube-api-access-kbvqg") pod "416a664b-73d0-4a4f-88d0-ac86e50321a2" (UID: "416a664b-73d0-4a4f-88d0-ac86e50321a2"). InnerVolumeSpecName "kube-api-access-kbvqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.150532 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "416a664b-73d0-4a4f-88d0-ac86e50321a2" (UID: "416a664b-73d0-4a4f-88d0-ac86e50321a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.230531 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbvqg\" (UniqueName: \"kubernetes.io/projected/416a664b-73d0-4a4f-88d0-ac86e50321a2-kube-api-access-kbvqg\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.230579 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.230588 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/416a664b-73d0-4a4f-88d0-ac86e50321a2-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.323660 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 
14:43:05.333016 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6b6sr"] Dec 03 14:43:05 crc kubenswrapper[5004]: I1203 14:43:05.623487 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" path="/var/lib/kubelet/pods/416a664b-73d0-4a4f-88d0-ac86e50321a2/volumes" Dec 03 14:43:39 crc kubenswrapper[5004]: I1203 14:43:39.291629 5004 generic.go:334] "Generic (PLEG): container finished" podID="752f8ea2-1e21-4ff4-aac9-4f1a5f662561" containerID="b7fb0dc95e94a40bae1da27d6c69bd860c717b0a84a9e651f5584b43197a0e08" exitCode=0 Dec 03 14:43:39 crc kubenswrapper[5004]: I1203 14:43:39.291691 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" event={"ID":"752f8ea2-1e21-4ff4-aac9-4f1a5f662561","Type":"ContainerDied","Data":"b7fb0dc95e94a40bae1da27d6c69bd860c717b0a84a9e651f5584b43197a0e08"} Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.772072 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905122 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905619 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905699 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmtb6\" (UniqueName: \"kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905739 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905894 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 
14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.905947 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory\") pod \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\" (UID: \"752f8ea2-1e21-4ff4-aac9-4f1a5f662561\") " Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.913265 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6" (OuterVolumeSpecName: "kube-api-access-vmtb6") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "kube-api-access-vmtb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.913956 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.940047 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.943039 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory" (OuterVolumeSpecName: "inventory") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.943642 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:43:40 crc kubenswrapper[5004]: I1203 14:43:40.967480 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "752f8ea2-1e21-4ff4-aac9-4f1a5f662561" (UID: "752f8ea2-1e21-4ff4-aac9-4f1a5f662561"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008627 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008676 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008693 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008704 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008718 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmtb6\" (UniqueName: \"kubernetes.io/projected/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-kube-api-access-vmtb6\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.008729 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/752f8ea2-1e21-4ff4-aac9-4f1a5f662561-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.313657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" 
event={"ID":"752f8ea2-1e21-4ff4-aac9-4f1a5f662561","Type":"ContainerDied","Data":"1a663fbdb867908ec590d28b3f011935f3440297f93d1a7232c1fec0bd02e4d3"} Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.313720 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a663fbdb867908ec590d28b3f011935f3440297f93d1a7232c1fec0bd02e4d3" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.313802 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.431111 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5"] Dec 03 14:43:41 crc kubenswrapper[5004]: E1203 14:43:41.431910 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="extract-utilities" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.431929 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="extract-utilities" Dec 03 14:43:41 crc kubenswrapper[5004]: E1203 14:43:41.431958 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="extract-content" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.431965 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="extract-content" Dec 03 14:43:41 crc kubenswrapper[5004]: E1203 14:43:41.431978 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="registry-server" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.431985 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="registry-server" Dec 03 14:43:41 crc 
kubenswrapper[5004]: E1203 14:43:41.432005 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752f8ea2-1e21-4ff4-aac9-4f1a5f662561" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.432013 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="752f8ea2-1e21-4ff4-aac9-4f1a5f662561" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.432175 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="416a664b-73d0-4a4f-88d0-ac86e50321a2" containerName="registry-server" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.432190 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="752f8ea2-1e21-4ff4-aac9-4f1a5f662561" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.432812 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.436108 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.436113 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.436187 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.439121 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.439527 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.449594 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5"] Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.618320 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.618562 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldswd\" (UniqueName: \"kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: 
\"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.618601 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.618652 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.618671 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.721141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.721575 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ldswd\" (UniqueName: \"kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.721638 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.721728 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.721759 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.726502 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.727258 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.727732 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.738390 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.740367 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldswd\" (UniqueName: \"kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-22sj5\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:41 crc kubenswrapper[5004]: I1203 14:43:41.756587 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:43:42 crc kubenswrapper[5004]: I1203 14:43:42.284949 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5"] Dec 03 14:43:42 crc kubenswrapper[5004]: I1203 14:43:42.325192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" event={"ID":"3e3f3f7f-8810-4c7f-b3b0-975700874959","Type":"ContainerStarted","Data":"9c5c133b8675dc55bad037f2f44a2ec727151a407be5e6c8193c5ed7e7177376"} Dec 03 14:43:43 crc kubenswrapper[5004]: I1203 14:43:43.334771 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" event={"ID":"3e3f3f7f-8810-4c7f-b3b0-975700874959","Type":"ContainerStarted","Data":"24aa78da0edf5ee25a0e5283f60e576d923173ef281bebda6a08daca2ba3fc1f"} Dec 03 14:43:43 crc kubenswrapper[5004]: I1203 14:43:43.353455 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" podStartSLOduration=1.800618327 podStartE2EDuration="2.353415742s" podCreationTimestamp="2025-12-03 14:43:41 +0000 UTC" firstStartedPulling="2025-12-03 14:43:42.291258689 +0000 UTC m=+2235.040228925" lastFinishedPulling="2025-12-03 14:43:42.844056104 +0000 UTC m=+2235.593026340" observedRunningTime="2025-12-03 14:43:43.350381355 +0000 UTC m=+2236.099351601" watchObservedRunningTime="2025-12-03 14:43:43.353415742 +0000 UTC m=+2236.102385978" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.176047 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8"] Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.178733 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.181451 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.182300 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.185794 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8"] Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.354844 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.354920 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.355008 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz59b\" (UniqueName: \"kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.456840 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.456919 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.456981 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz59b\" (UniqueName: \"kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.458027 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.470707 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.475071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz59b\" (UniqueName: \"kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b\") pod \"collect-profiles-29412885-vjqt8\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.510034 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:00 crc kubenswrapper[5004]: I1203 14:45:00.973750 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8"] Dec 03 14:45:01 crc kubenswrapper[5004]: I1203 14:45:01.031987 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" event={"ID":"98315f88-3a46-471d-8482-650d59c8abbb","Type":"ContainerStarted","Data":"9860a5a6622d5e3cbd0112d5a275c7e45561c3a942ea48632f88107a9b177dd3"} Dec 03 14:45:02 crc kubenswrapper[5004]: I1203 14:45:02.044769 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" event={"ID":"98315f88-3a46-471d-8482-650d59c8abbb","Type":"ContainerStarted","Data":"b7f0da29b5ed22a7b5f05536f04502c97206d586a6d8bcf46c3ba796b5b92355"} Dec 03 14:45:02 crc kubenswrapper[5004]: I1203 14:45:02.068884 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" 
podStartSLOduration=2.068836616 podStartE2EDuration="2.068836616s" podCreationTimestamp="2025-12-03 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 14:45:02.057032217 +0000 UTC m=+2314.806002453" watchObservedRunningTime="2025-12-03 14:45:02.068836616 +0000 UTC m=+2314.817806892" Dec 03 14:45:03 crc kubenswrapper[5004]: I1203 14:45:03.053640 5004 generic.go:334] "Generic (PLEG): container finished" podID="98315f88-3a46-471d-8482-650d59c8abbb" containerID="b7f0da29b5ed22a7b5f05536f04502c97206d586a6d8bcf46c3ba796b5b92355" exitCode=0 Dec 03 14:45:03 crc kubenswrapper[5004]: I1203 14:45:03.053948 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" event={"ID":"98315f88-3a46-471d-8482-650d59c8abbb","Type":"ContainerDied","Data":"b7f0da29b5ed22a7b5f05536f04502c97206d586a6d8bcf46c3ba796b5b92355"} Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.377982 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.556629 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume\") pod \"98315f88-3a46-471d-8482-650d59c8abbb\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.556937 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz59b\" (UniqueName: \"kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b\") pod \"98315f88-3a46-471d-8482-650d59c8abbb\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.557129 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume\") pod \"98315f88-3a46-471d-8482-650d59c8abbb\" (UID: \"98315f88-3a46-471d-8482-650d59c8abbb\") " Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.558221 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume" (OuterVolumeSpecName: "config-volume") pod "98315f88-3a46-471d-8482-650d59c8abbb" (UID: "98315f88-3a46-471d-8482-650d59c8abbb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.565309 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98315f88-3a46-471d-8482-650d59c8abbb" (UID: "98315f88-3a46-471d-8482-650d59c8abbb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.570444 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b" (OuterVolumeSpecName: "kube-api-access-zz59b") pod "98315f88-3a46-471d-8482-650d59c8abbb" (UID: "98315f88-3a46-471d-8482-650d59c8abbb"). InnerVolumeSpecName "kube-api-access-zz59b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.659343 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98315f88-3a46-471d-8482-650d59c8abbb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.659386 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz59b\" (UniqueName: \"kubernetes.io/projected/98315f88-3a46-471d-8482-650d59c8abbb-kube-api-access-zz59b\") on node \"crc\" DevicePath \"\"" Dec 03 14:45:04 crc kubenswrapper[5004]: I1203 14:45:04.659395 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98315f88-3a46-471d-8482-650d59c8abbb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.084816 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" event={"ID":"98315f88-3a46-471d-8482-650d59c8abbb","Type":"ContainerDied","Data":"9860a5a6622d5e3cbd0112d5a275c7e45561c3a942ea48632f88107a9b177dd3"} Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.084898 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9860a5a6622d5e3cbd0112d5a275c7e45561c3a942ea48632f88107a9b177dd3" Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.084981 5004 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412885-vjqt8" Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.149176 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"] Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.157546 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412840-v4lg7"] Dec 03 14:45:05 crc kubenswrapper[5004]: I1203 14:45:05.628806 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ebcf96-e3f0-4036-983c-c38f9f88ac4f" path="/var/lib/kubelet/pods/b0ebcf96-e3f0-4036-983c-c38f9f88ac4f/volumes" Dec 03 14:45:22 crc kubenswrapper[5004]: I1203 14:45:22.824548 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:45:22 crc kubenswrapper[5004]: I1203 14:45:22.825172 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:45:37 crc kubenswrapper[5004]: I1203 14:45:37.557493 5004 scope.go:117] "RemoveContainer" containerID="d11d97ce45bcdba61708e9322b0083ff25a7dcd5c672c72780a53313cbabbfb7" Dec 03 14:45:52 crc kubenswrapper[5004]: I1203 14:45:52.824851 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 14:45:52 crc kubenswrapper[5004]: I1203 14:45:52.825583 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:46:22 crc kubenswrapper[5004]: I1203 14:46:22.824848 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:46:22 crc kubenswrapper[5004]: I1203 14:46:22.825521 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:46:22 crc kubenswrapper[5004]: I1203 14:46:22.825583 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:46:22 crc kubenswrapper[5004]: I1203 14:46:22.826398 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:46:22 crc kubenswrapper[5004]: I1203 14:46:22.826488 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" gracePeriod=600 Dec 03 14:46:23 crc kubenswrapper[5004]: E1203 14:46:23.039953 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:46:23 crc kubenswrapper[5004]: I1203 14:46:23.822312 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" exitCode=0 Dec 03 14:46:23 crc kubenswrapper[5004]: I1203 14:46:23.822372 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8"} Dec 03 14:46:23 crc kubenswrapper[5004]: I1203 14:46:23.822704 5004 scope.go:117] "RemoveContainer" containerID="0b0b32460eca176637d73c06077cdcb8ff0e39c9e13a9e65acb17fa516210c29" Dec 03 14:46:23 crc kubenswrapper[5004]: I1203 14:46:23.823426 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:46:23 crc kubenswrapper[5004]: E1203 14:46:23.823678 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:46:36 crc kubenswrapper[5004]: I1203 14:46:36.612682 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:46:36 crc kubenswrapper[5004]: E1203 14:46:36.614880 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:46:51 crc kubenswrapper[5004]: I1203 14:46:51.613661 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:46:51 crc kubenswrapper[5004]: E1203 14:46:51.614825 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:47:02 crc kubenswrapper[5004]: I1203 14:47:02.612776 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:47:02 crc kubenswrapper[5004]: E1203 14:47:02.613563 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:47:15 crc kubenswrapper[5004]: I1203 14:47:15.614052 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:47:15 crc kubenswrapper[5004]: E1203 14:47:15.615365 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:47:28 crc kubenswrapper[5004]: I1203 14:47:28.613720 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:47:28 crc kubenswrapper[5004]: E1203 14:47:28.614659 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:47:43 crc kubenswrapper[5004]: I1203 14:47:43.612778 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:47:43 crc kubenswrapper[5004]: E1203 14:47:43.613726 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:47:47 crc kubenswrapper[5004]: I1203 14:47:47.598536 5004 generic.go:334] "Generic (PLEG): container finished" podID="3e3f3f7f-8810-4c7f-b3b0-975700874959" containerID="24aa78da0edf5ee25a0e5283f60e576d923173ef281bebda6a08daca2ba3fc1f" exitCode=0 Dec 03 14:47:47 crc kubenswrapper[5004]: I1203 14:47:47.598664 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" event={"ID":"3e3f3f7f-8810-4c7f-b3b0-975700874959","Type":"ContainerDied","Data":"24aa78da0edf5ee25a0e5283f60e576d923173ef281bebda6a08daca2ba3fc1f"} Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.075116 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.153017 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0\") pod \"3e3f3f7f-8810-4c7f-b3b0-975700874959\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.153129 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key\") pod \"3e3f3f7f-8810-4c7f-b3b0-975700874959\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.153224 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory\") pod \"3e3f3f7f-8810-4c7f-b3b0-975700874959\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.153266 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldswd\" (UniqueName: \"kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd\") pod \"3e3f3f7f-8810-4c7f-b3b0-975700874959\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.153318 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle\") pod \"3e3f3f7f-8810-4c7f-b3b0-975700874959\" (UID: \"3e3f3f7f-8810-4c7f-b3b0-975700874959\") " Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.160183 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3e3f3f7f-8810-4c7f-b3b0-975700874959" (UID: "3e3f3f7f-8810-4c7f-b3b0-975700874959"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.166057 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd" (OuterVolumeSpecName: "kube-api-access-ldswd") pod "3e3f3f7f-8810-4c7f-b3b0-975700874959" (UID: "3e3f3f7f-8810-4c7f-b3b0-975700874959"). InnerVolumeSpecName "kube-api-access-ldswd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.181219 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3e3f3f7f-8810-4c7f-b3b0-975700874959" (UID: "3e3f3f7f-8810-4c7f-b3b0-975700874959"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.193698 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e3f3f7f-8810-4c7f-b3b0-975700874959" (UID: "3e3f3f7f-8810-4c7f-b3b0-975700874959"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.218673 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory" (OuterVolumeSpecName: "inventory") pod "3e3f3f7f-8810-4c7f-b3b0-975700874959" (UID: "3e3f3f7f-8810-4c7f-b3b0-975700874959"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.255107 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.255145 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.255159 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.255174 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldswd\" (UniqueName: \"kubernetes.io/projected/3e3f3f7f-8810-4c7f-b3b0-975700874959-kube-api-access-ldswd\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.255189 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3f3f7f-8810-4c7f-b3b0-975700874959-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.620478 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.627434 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-22sj5" event={"ID":"3e3f3f7f-8810-4c7f-b3b0-975700874959","Type":"ContainerDied","Data":"9c5c133b8675dc55bad037f2f44a2ec727151a407be5e6c8193c5ed7e7177376"} Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.627692 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c5c133b8675dc55bad037f2f44a2ec727151a407be5e6c8193c5ed7e7177376" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.722169 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz"] Dec 03 14:47:49 crc kubenswrapper[5004]: E1203 14:47:49.722987 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98315f88-3a46-471d-8482-650d59c8abbb" containerName="collect-profiles" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.723011 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="98315f88-3a46-471d-8482-650d59c8abbb" containerName="collect-profiles" Dec 03 14:47:49 crc kubenswrapper[5004]: E1203 14:47:49.723031 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3f3f7f-8810-4c7f-b3b0-975700874959" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.723041 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3f3f7f-8810-4c7f-b3b0-975700874959" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.723242 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3f3f7f-8810-4c7f-b3b0-975700874959" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.723266 5004 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="98315f88-3a46-471d-8482-650d59c8abbb" containerName="collect-profiles" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.723995 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.727085 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.728280 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.728547 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.728668 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.728774 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.728931 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.732790 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.739796 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz"] Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868185 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868225 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868257 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868488 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868649 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zpj\" (UniqueName: \"kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868719 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868755 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.868779 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.969959 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970054 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zpj\" (UniqueName: \"kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970133 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970162 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" 
(UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970199 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970259 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.970285 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.971480 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.976338 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.976474 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.976691 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.976810 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc 
kubenswrapper[5004]: I1203 14:47:49.977481 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.977921 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.978072 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:49 crc kubenswrapper[5004]: I1203 14:47:49.991768 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zpj\" (UniqueName: \"kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ffvrz\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:50 crc kubenswrapper[5004]: I1203 14:47:50.043095 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:47:50 crc kubenswrapper[5004]: I1203 14:47:50.564474 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz"] Dec 03 14:47:50 crc kubenswrapper[5004]: W1203 14:47:50.569390 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a75e28_35af_4a42_ae5c_ac1a24ba78ee.slice/crio-724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5 WatchSource:0}: Error finding container 724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5: Status 404 returned error can't find the container with id 724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5 Dec 03 14:47:50 crc kubenswrapper[5004]: I1203 14:47:50.571931 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:47:50 crc kubenswrapper[5004]: I1203 14:47:50.630362 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" event={"ID":"32a75e28-35af-4a42-ae5c-ac1a24ba78ee","Type":"ContainerStarted","Data":"724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5"} Dec 03 14:47:51 crc kubenswrapper[5004]: I1203 14:47:51.639001 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" event={"ID":"32a75e28-35af-4a42-ae5c-ac1a24ba78ee","Type":"ContainerStarted","Data":"b1023164279ce7ae1337b5917ce064453f1371b8274ad9d044f32430ea1757e4"} Dec 03 14:47:51 crc kubenswrapper[5004]: I1203 14:47:51.657484 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" podStartSLOduration=2.279317773 podStartE2EDuration="2.657467212s" podCreationTimestamp="2025-12-03 14:47:49 +0000 UTC" firstStartedPulling="2025-12-03 
14:47:50.571580038 +0000 UTC m=+2483.320550274" lastFinishedPulling="2025-12-03 14:47:50.949729457 +0000 UTC m=+2483.698699713" observedRunningTime="2025-12-03 14:47:51.656389171 +0000 UTC m=+2484.405359417" watchObservedRunningTime="2025-12-03 14:47:51.657467212 +0000 UTC m=+2484.406437448" Dec 03 14:47:55 crc kubenswrapper[5004]: I1203 14:47:55.613526 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:47:55 crc kubenswrapper[5004]: E1203 14:47:55.614634 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:48:08 crc kubenswrapper[5004]: I1203 14:48:08.613482 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:48:08 crc kubenswrapper[5004]: E1203 14:48:08.614231 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:48:23 crc kubenswrapper[5004]: I1203 14:48:23.613545 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:48:23 crc kubenswrapper[5004]: E1203 14:48:23.614301 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:48:37 crc kubenswrapper[5004]: I1203 14:48:37.613416 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:48:37 crc kubenswrapper[5004]: E1203 14:48:37.614581 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:48:51 crc kubenswrapper[5004]: I1203 14:48:51.612586 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:48:51 crc kubenswrapper[5004]: E1203 14:48:51.613374 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:49:02 crc kubenswrapper[5004]: I1203 14:49:02.629607 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:49:02 crc kubenswrapper[5004]: E1203 14:49:02.632100 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:49:15 crc kubenswrapper[5004]: I1203 14:49:15.613382 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:49:15 crc kubenswrapper[5004]: E1203 14:49:15.614382 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:49:26 crc kubenswrapper[5004]: I1203 14:49:26.613648 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:49:26 crc kubenswrapper[5004]: E1203 14:49:26.614911 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:49:38 crc kubenswrapper[5004]: I1203 14:49:38.613329 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:49:38 crc kubenswrapper[5004]: E1203 14:49:38.614196 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:49:49 crc kubenswrapper[5004]: I1203 14:49:49.613366 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:49:49 crc kubenswrapper[5004]: E1203 14:49:49.614898 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:50:04 crc kubenswrapper[5004]: I1203 14:50:04.613627 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:50:04 crc kubenswrapper[5004]: E1203 14:50:04.614493 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.012528 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.015103 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.039308 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.129771 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.130454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62rm\" (UniqueName: \"kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.130518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.232648 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k62rm\" (UniqueName: \"kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.232709 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.232805 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.233299 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.233394 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.252402 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k62rm\" (UniqueName: \"kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm\") pod \"community-operators-vmz22\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.352841 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:06 crc kubenswrapper[5004]: I1203 14:50:06.858607 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:07 crc kubenswrapper[5004]: I1203 14:50:07.834485 5004 generic.go:334] "Generic (PLEG): container finished" podID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerID="267bfa0152f777c0321be95b2e1dd031071dbfccfc03c971ef10503f6b8a9b6d" exitCode=0 Dec 03 14:50:07 crc kubenswrapper[5004]: I1203 14:50:07.834527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerDied","Data":"267bfa0152f777c0321be95b2e1dd031071dbfccfc03c971ef10503f6b8a9b6d"} Dec 03 14:50:07 crc kubenswrapper[5004]: I1203 14:50:07.834816 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerStarted","Data":"1428b7f2502d24d58606c2b81be7b3e9c0ec61fc097a53ba54686a46dcc639e0"} Dec 03 14:50:08 crc kubenswrapper[5004]: I1203 14:50:08.847220 5004 generic.go:334] "Generic (PLEG): container finished" podID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerID="a4f110a04eb1e6cae6ee0027952612574d96cff54d6dfd0f9e22a349319f519b" exitCode=0 Dec 03 14:50:08 crc kubenswrapper[5004]: I1203 14:50:08.847281 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerDied","Data":"a4f110a04eb1e6cae6ee0027952612574d96cff54d6dfd0f9e22a349319f519b"} Dec 03 14:50:09 crc kubenswrapper[5004]: I1203 14:50:09.860361 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" 
event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerStarted","Data":"40b74fd42bad6454918fafee902dd56ad291391bad2b07a3b0188002805c68ae"} Dec 03 14:50:09 crc kubenswrapper[5004]: I1203 14:50:09.896681 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmz22" podStartSLOduration=3.348869079 podStartE2EDuration="4.896657837s" podCreationTimestamp="2025-12-03 14:50:05 +0000 UTC" firstStartedPulling="2025-12-03 14:50:07.836842335 +0000 UTC m=+2620.585812571" lastFinishedPulling="2025-12-03 14:50:09.384631093 +0000 UTC m=+2622.133601329" observedRunningTime="2025-12-03 14:50:09.884796437 +0000 UTC m=+2622.633766673" watchObservedRunningTime="2025-12-03 14:50:09.896657837 +0000 UTC m=+2622.645628073" Dec 03 14:50:16 crc kubenswrapper[5004]: I1203 14:50:16.353258 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:16 crc kubenswrapper[5004]: I1203 14:50:16.353736 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:16 crc kubenswrapper[5004]: I1203 14:50:16.420353 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:16 crc kubenswrapper[5004]: I1203 14:50:16.613266 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:50:16 crc kubenswrapper[5004]: E1203 14:50:16.613498 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:50:16 crc kubenswrapper[5004]: I1203 14:50:16.961048 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:17 crc kubenswrapper[5004]: I1203 14:50:17.007245 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:18 crc kubenswrapper[5004]: I1203 14:50:18.934914 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vmz22" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="registry-server" containerID="cri-o://40b74fd42bad6454918fafee902dd56ad291391bad2b07a3b0188002805c68ae" gracePeriod=2 Dec 03 14:50:19 crc kubenswrapper[5004]: I1203 14:50:19.946775 5004 generic.go:334] "Generic (PLEG): container finished" podID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerID="40b74fd42bad6454918fafee902dd56ad291391bad2b07a3b0188002805c68ae" exitCode=0 Dec 03 14:50:19 crc kubenswrapper[5004]: I1203 14:50:19.946827 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerDied","Data":"40b74fd42bad6454918fafee902dd56ad291391bad2b07a3b0188002805c68ae"} Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.591201 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.702222 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content\") pod \"5ffaedb3-e894-4b84-ac21-3adeda97783f\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.702368 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k62rm\" (UniqueName: \"kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm\") pod \"5ffaedb3-e894-4b84-ac21-3adeda97783f\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.702404 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities\") pod \"5ffaedb3-e894-4b84-ac21-3adeda97783f\" (UID: \"5ffaedb3-e894-4b84-ac21-3adeda97783f\") " Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.703507 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities" (OuterVolumeSpecName: "utilities") pod "5ffaedb3-e894-4b84-ac21-3adeda97783f" (UID: "5ffaedb3-e894-4b84-ac21-3adeda97783f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.708872 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm" (OuterVolumeSpecName: "kube-api-access-k62rm") pod "5ffaedb3-e894-4b84-ac21-3adeda97783f" (UID: "5ffaedb3-e894-4b84-ac21-3adeda97783f"). InnerVolumeSpecName "kube-api-access-k62rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.754490 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ffaedb3-e894-4b84-ac21-3adeda97783f" (UID: "5ffaedb3-e894-4b84-ac21-3adeda97783f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.804394 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.804430 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k62rm\" (UniqueName: \"kubernetes.io/projected/5ffaedb3-e894-4b84-ac21-3adeda97783f-kube-api-access-k62rm\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.804442 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ffaedb3-e894-4b84-ac21-3adeda97783f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.965016 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmz22" event={"ID":"5ffaedb3-e894-4b84-ac21-3adeda97783f","Type":"ContainerDied","Data":"1428b7f2502d24d58606c2b81be7b3e9c0ec61fc097a53ba54686a46dcc639e0"} Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.965088 5004 scope.go:117] "RemoveContainer" containerID="40b74fd42bad6454918fafee902dd56ad291391bad2b07a3b0188002805c68ae" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.965173 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmz22" Dec 03 14:50:20 crc kubenswrapper[5004]: I1203 14:50:20.985392 5004 scope.go:117] "RemoveContainer" containerID="a4f110a04eb1e6cae6ee0027952612574d96cff54d6dfd0f9e22a349319f519b" Dec 03 14:50:21 crc kubenswrapper[5004]: I1203 14:50:21.006367 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:21 crc kubenswrapper[5004]: I1203 14:50:21.018129 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vmz22"] Dec 03 14:50:21 crc kubenswrapper[5004]: I1203 14:50:21.026917 5004 scope.go:117] "RemoveContainer" containerID="267bfa0152f777c0321be95b2e1dd031071dbfccfc03c971ef10503f6b8a9b6d" Dec 03 14:50:21 crc kubenswrapper[5004]: I1203 14:50:21.626610 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" path="/var/lib/kubelet/pods/5ffaedb3-e894-4b84-ac21-3adeda97783f/volumes" Dec 03 14:50:25 crc kubenswrapper[5004]: I1203 14:50:25.002961 5004 generic.go:334] "Generic (PLEG): container finished" podID="32a75e28-35af-4a42-ae5c-ac1a24ba78ee" containerID="b1023164279ce7ae1337b5917ce064453f1371b8274ad9d044f32430ea1757e4" exitCode=0 Dec 03 14:50:25 crc kubenswrapper[5004]: I1203 14:50:25.003076 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" event={"ID":"32a75e28-35af-4a42-ae5c-ac1a24ba78ee","Type":"ContainerDied","Data":"b1023164279ce7ae1337b5917ce064453f1371b8274ad9d044f32430ea1757e4"} Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.428623 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.625996 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626038 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626150 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7zpj\" (UniqueName: \"kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626178 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626198 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626234 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626832 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.626952 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1\") pod \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\" (UID: \"32a75e28-35af-4a42-ae5c-ac1a24ba78ee\") " Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.631824 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.651047 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj" (OuterVolumeSpecName: "kube-api-access-j7zpj") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "kube-api-access-j7zpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.657574 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.657591 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.660887 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.662980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.663986 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.671242 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory" (OuterVolumeSpecName: "inventory") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.683144 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "32a75e28-35af-4a42-ae5c-ac1a24ba78ee" (UID: "32a75e28-35af-4a42-ae5c-ac1a24ba78ee"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729484 5004 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729517 5004 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729528 5004 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729537 5004 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729546 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729554 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7zpj\" (UniqueName: \"kubernetes.io/projected/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-kube-api-access-j7zpj\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729562 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc 
kubenswrapper[5004]: I1203 14:50:26.729570 5004 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:26 crc kubenswrapper[5004]: I1203 14:50:26.729579 5004 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a75e28-35af-4a42-ae5c-ac1a24ba78ee-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.025591 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" event={"ID":"32a75e28-35af-4a42-ae5c-ac1a24ba78ee","Type":"ContainerDied","Data":"724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5"} Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.025886 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="724821a7f8c1e886bbd6d10b3b976933c7dc661ffe7dbafc9df2b473d1cc5ad5" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.025679 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ffvrz" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.126575 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r"] Dec 03 14:50:27 crc kubenswrapper[5004]: E1203 14:50:27.127060 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="registry-server" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127082 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="registry-server" Dec 03 14:50:27 crc kubenswrapper[5004]: E1203 14:50:27.127096 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a75e28-35af-4a42-ae5c-ac1a24ba78ee" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127104 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a75e28-35af-4a42-ae5c-ac1a24ba78ee" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:27 crc kubenswrapper[5004]: E1203 14:50:27.127120 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="extract-utilities" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127128 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="extract-utilities" Dec 03 14:50:27 crc kubenswrapper[5004]: E1203 14:50:27.127142 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="extract-content" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127150 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="extract-content" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127379 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffaedb3-e894-4b84-ac21-3adeda97783f" containerName="registry-server" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.127418 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a75e28-35af-4a42-ae5c-ac1a24ba78ee" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.128203 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.131307 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.132099 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.132637 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.133220 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.133651 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ks4dw" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.136681 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r"] Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.248783 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: 
\"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249121 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249370 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249505 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249613 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249728 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.249912 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5z5q\" (UniqueName: \"kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350505 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5z5q\" (UniqueName: \"kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350579 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350649 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350674 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350706 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.350742 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.357449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.357613 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.358502 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.359430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: 
I1203 14:50:27.359993 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.360356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.379503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5z5q\" (UniqueName: \"kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xx47r\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:27 crc kubenswrapper[5004]: I1203 14:50:27.452158 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:50:28 crc kubenswrapper[5004]: I1203 14:50:27.995126 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r"] Dec 03 14:50:28 crc kubenswrapper[5004]: I1203 14:50:28.035815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" event={"ID":"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f","Type":"ContainerStarted","Data":"86952839a830ee6eefac398ef14e93a2d8d802104361fb5abf6bb471a78b4d2b"} Dec 03 14:50:28 crc kubenswrapper[5004]: I1203 14:50:28.501173 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 14:50:29 crc kubenswrapper[5004]: I1203 14:50:29.045419 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" event={"ID":"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f","Type":"ContainerStarted","Data":"10990cd6889d784a454a27648744535ba78d2f1adde8978a348779caea779121"} Dec 03 14:50:29 crc kubenswrapper[5004]: I1203 14:50:29.063343 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" podStartSLOduration=1.578963694 podStartE2EDuration="2.063313673s" podCreationTimestamp="2025-12-03 14:50:27 +0000 UTC" firstStartedPulling="2025-12-03 14:50:28.008337147 +0000 UTC m=+2640.757307383" lastFinishedPulling="2025-12-03 14:50:28.492687126 +0000 UTC m=+2641.241657362" observedRunningTime="2025-12-03 14:50:29.059934726 +0000 UTC m=+2641.808904962" watchObservedRunningTime="2025-12-03 14:50:29.063313673 +0000 UTC m=+2641.812283909" Dec 03 14:50:30 crc kubenswrapper[5004]: I1203 14:50:30.613568 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:50:30 crc kubenswrapper[5004]: 
E1203 14:50:30.614197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:50:42 crc kubenswrapper[5004]: I1203 14:50:42.620956 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:50:42 crc kubenswrapper[5004]: E1203 14:50:42.622755 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:50:53 crc kubenswrapper[5004]: I1203 14:50:53.613137 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:50:53 crc kubenswrapper[5004]: E1203 14:50:53.614020 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:51:06 crc kubenswrapper[5004]: I1203 14:51:06.612657 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:51:06 crc 
kubenswrapper[5004]: E1203 14:51:06.613463 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:51:20 crc kubenswrapper[5004]: I1203 14:51:20.613315 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:51:20 crc kubenswrapper[5004]: E1203 14:51:20.615045 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:51:31 crc kubenswrapper[5004]: I1203 14:51:31.614482 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:51:32 crc kubenswrapper[5004]: I1203 14:51:32.586750 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472"} Dec 03 14:52:39 crc kubenswrapper[5004]: I1203 14:52:39.181180 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" containerID="10990cd6889d784a454a27648744535ba78d2f1adde8978a348779caea779121" exitCode=0 Dec 03 14:52:39 crc kubenswrapper[5004]: I1203 14:52:39.181302 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" event={"ID":"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f","Type":"ContainerDied","Data":"10990cd6889d784a454a27648744535ba78d2f1adde8978a348779caea779121"} Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.661325 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.694819 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695372 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695527 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695577 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695619 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5z5q\" (UniqueName: \"kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695779 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.695895 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2\") pod \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\" (UID: \"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f\") " Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.707059 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q" (OuterVolumeSpecName: "kube-api-access-s5z5q") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "kube-api-access-s5z5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.707477 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.731255 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.736207 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.736592 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.740470 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.751004 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory" (OuterVolumeSpecName: "inventory") pod "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" (UID: "cf32991a-bf4f-4ce6-9d01-3b75e2108b9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798172 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798332 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798399 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798453 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798537 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5z5q\" (UniqueName: \"kubernetes.io/projected/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-kube-api-access-s5z5q\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798594 5004 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:40 crc kubenswrapper[5004]: I1203 14:52:40.798653 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cf32991a-bf4f-4ce6-9d01-3b75e2108b9f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 14:52:41 crc kubenswrapper[5004]: I1203 14:52:41.200959 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" event={"ID":"cf32991a-bf4f-4ce6-9d01-3b75e2108b9f","Type":"ContainerDied","Data":"86952839a830ee6eefac398ef14e93a2d8d802104361fb5abf6bb471a78b4d2b"} Dec 03 14:52:41 crc kubenswrapper[5004]: I1203 14:52:41.201012 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86952839a830ee6eefac398ef14e93a2d8d802104361fb5abf6bb471a78b4d2b" Dec 03 14:52:41 crc kubenswrapper[5004]: I1203 14:52:41.201047 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xx47r" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.106389 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:52:47 crc kubenswrapper[5004]: E1203 14:52:47.107323 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.107338 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.107550 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf32991a-bf4f-4ce6-9d01-3b75e2108b9f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.108958 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.124970 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.227501 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.227548 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.227609 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pm8\" (UniqueName: \"kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.329440 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.329827 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.329908 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pm8\" (UniqueName: \"kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.330164 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.330635 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.351962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pm8\" (UniqueName: \"kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8\") pod \"redhat-marketplace-fw2fm\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.429631 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:47 crc kubenswrapper[5004]: I1203 14:52:47.970309 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:52:48 crc kubenswrapper[5004]: I1203 14:52:48.265465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerStarted","Data":"3dbfde094dbd9c7de6ec73739a78a074078bd1e928c3ad3657b7b1463bc93ca0"} Dec 03 14:52:48 crc kubenswrapper[5004]: I1203 14:52:48.904742 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:52:48 crc kubenswrapper[5004]: I1203 14:52:48.907691 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:48 crc kubenswrapper[5004]: I1203 14:52:48.926047 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.068411 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.068510 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.068551 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwt9\" (UniqueName: \"kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.170801 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.171231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.171282 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwt9\" (UniqueName: \"kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.171437 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.171763 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.193545 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwt9\" (UniqueName: \"kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9\") pod \"redhat-operators-s9q5g\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.278436 5004 generic.go:334] "Generic (PLEG): container finished" podID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerID="f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2" exitCode=0 Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.278501 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerDied","Data":"f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2"} Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.281619 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:52:49 crc kubenswrapper[5004]: I1203 14:52:49.694276 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.290971 5004 generic.go:334] "Generic (PLEG): container finished" podID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerID="3fe154a383df502bfddc133cb3c554c6cdf9565a14dcb80ff1e8e61354d38f12" exitCode=0 Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.291029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerDied","Data":"3fe154a383df502bfddc133cb3c554c6cdf9565a14dcb80ff1e8e61354d38f12"} Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.291063 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerStarted","Data":"a13e520f775cbb157ed878a59c79e766dbe1a46b5f7f43854f26291432380058"} Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.308515 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.312783 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.318210 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.426596 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q798l\" (UniqueName: \"kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.426648 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.426673 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.529079 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q798l\" (UniqueName: \"kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.529132 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.529169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.529781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.529821 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.552785 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q798l\" (UniqueName: \"kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l\") pod \"certified-operators-nd6wz\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:50 crc kubenswrapper[5004]: I1203 14:52:50.632304 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:52:51 crc kubenswrapper[5004]: I1203 14:52:51.189574 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:52:51 crc kubenswrapper[5004]: I1203 14:52:51.333783 5004 generic.go:334] "Generic (PLEG): container finished" podID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerID="a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248" exitCode=0 Dec 03 14:52:51 crc kubenswrapper[5004]: I1203 14:52:51.334726 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerDied","Data":"a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248"} Dec 03 14:52:51 crc kubenswrapper[5004]: I1203 14:52:51.340772 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerStarted","Data":"e7a123cbc22307f848ac1b582eeed6ca311ac96234ce43cc797da0676e28f7c9"} Dec 03 14:52:51 crc kubenswrapper[5004]: I1203 14:52:51.342033 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 14:52:52 crc kubenswrapper[5004]: I1203 14:52:52.352593 5004 generic.go:334] "Generic (PLEG): container finished" podID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerID="122bc9606c0ada5ab7a51023d9819de7bf632c05016f7bd5e912316dcdc5a7b5" exitCode=0 Dec 03 14:52:52 crc kubenswrapper[5004]: I1203 14:52:52.352657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerDied","Data":"122bc9606c0ada5ab7a51023d9819de7bf632c05016f7bd5e912316dcdc5a7b5"} Dec 03 14:52:52 crc kubenswrapper[5004]: I1203 14:52:52.365311 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerStarted","Data":"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749"} Dec 03 14:52:52 crc kubenswrapper[5004]: I1203 14:52:52.400012 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fw2fm" podStartSLOduration=2.634436726 podStartE2EDuration="5.399988433s" podCreationTimestamp="2025-12-03 14:52:47 +0000 UTC" firstStartedPulling="2025-12-03 14:52:49.28027079 +0000 UTC m=+2782.029241026" lastFinishedPulling="2025-12-03 14:52:52.045822497 +0000 UTC m=+2784.794792733" observedRunningTime="2025-12-03 14:52:52.396368099 +0000 UTC m=+2785.145338335" watchObservedRunningTime="2025-12-03 14:52:52.399988433 +0000 UTC m=+2785.148958669" Dec 03 14:52:53 crc kubenswrapper[5004]: I1203 14:52:53.377033 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerStarted","Data":"bc04099d7e854c74c2bc8a60e00d7096041b662f9705209fe0450a3a032f3652"} Dec 03 14:52:54 crc kubenswrapper[5004]: I1203 14:52:54.418044 5004 generic.go:334] "Generic (PLEG): container finished" podID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerID="bc04099d7e854c74c2bc8a60e00d7096041b662f9705209fe0450a3a032f3652" exitCode=0 Dec 03 14:52:54 crc kubenswrapper[5004]: I1203 14:52:54.418089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerDied","Data":"bc04099d7e854c74c2bc8a60e00d7096041b662f9705209fe0450a3a032f3652"} Dec 03 14:52:57 crc kubenswrapper[5004]: I1203 14:52:57.430609 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:57 crc kubenswrapper[5004]: I1203 14:52:57.431092 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:57 crc kubenswrapper[5004]: I1203 14:52:57.463685 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerStarted","Data":"49d37ff82e112f43b8fd31d5930216cfa495a6386ac858827ced3aec79e3a4ab"} Dec 03 14:52:57 crc kubenswrapper[5004]: I1203 14:52:57.486851 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:57 crc kubenswrapper[5004]: I1203 14:52:57.553566 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:52:58 crc kubenswrapper[5004]: I1203 14:52:58.472922 5004 generic.go:334] "Generic (PLEG): container finished" podID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerID="49d37ff82e112f43b8fd31d5930216cfa495a6386ac858827ced3aec79e3a4ab" exitCode=0 Dec 03 14:52:58 crc kubenswrapper[5004]: I1203 14:52:58.473064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerDied","Data":"49d37ff82e112f43b8fd31d5930216cfa495a6386ac858827ced3aec79e3a4ab"} Dec 03 14:52:59 crc kubenswrapper[5004]: I1203 14:52:59.094393 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:52:59 crc kubenswrapper[5004]: I1203 14:52:59.503435 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fw2fm" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="registry-server" containerID="cri-o://5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749" gracePeriod=2 Dec 03 14:52:59 crc kubenswrapper[5004]: I1203 
14:52:59.503874 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerStarted","Data":"48f8aa3229e4aa0636ca5e375b0d3f1069d8d78af44f12b8fe891934e0990f46"} Dec 03 14:52:59 crc kubenswrapper[5004]: I1203 14:52:59.538957 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s9q5g" podStartSLOduration=5.356583825 podStartE2EDuration="11.538933601s" podCreationTimestamp="2025-12-03 14:52:48 +0000 UTC" firstStartedPulling="2025-12-03 14:52:51.342668883 +0000 UTC m=+2784.091639119" lastFinishedPulling="2025-12-03 14:52:57.525018649 +0000 UTC m=+2790.273988895" observedRunningTime="2025-12-03 14:52:59.53088531 +0000 UTC m=+2792.279855556" watchObservedRunningTime="2025-12-03 14:52:59.538933601 +0000 UTC m=+2792.287903837" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.182830 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.236090 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content\") pod \"4e002c65-d541-4fe8-ab60-9a338d693b16\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.236218 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities\") pod \"4e002c65-d541-4fe8-ab60-9a338d693b16\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.236307 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4pm8\" (UniqueName: \"kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8\") pod \"4e002c65-d541-4fe8-ab60-9a338d693b16\" (UID: \"4e002c65-d541-4fe8-ab60-9a338d693b16\") " Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.237452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities" (OuterVolumeSpecName: "utilities") pod "4e002c65-d541-4fe8-ab60-9a338d693b16" (UID: "4e002c65-d541-4fe8-ab60-9a338d693b16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.243351 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8" (OuterVolumeSpecName: "kube-api-access-g4pm8") pod "4e002c65-d541-4fe8-ab60-9a338d693b16" (UID: "4e002c65-d541-4fe8-ab60-9a338d693b16"). InnerVolumeSpecName "kube-api-access-g4pm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.271324 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e002c65-d541-4fe8-ab60-9a338d693b16" (UID: "4e002c65-d541-4fe8-ab60-9a338d693b16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.338258 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.338321 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e002c65-d541-4fe8-ab60-9a338d693b16-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.338333 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4pm8\" (UniqueName: \"kubernetes.io/projected/4e002c65-d541-4fe8-ab60-9a338d693b16-kube-api-access-g4pm8\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.521375 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerStarted","Data":"ff3d22c5376c4770697b615819bdaebc73f57babbe48118030b0b4b3d2301477"} Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.526383 5004 generic.go:334] "Generic (PLEG): container finished" podID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerID="5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749" exitCode=0 Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.526432 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerDied","Data":"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749"} Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.526465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw2fm" event={"ID":"4e002c65-d541-4fe8-ab60-9a338d693b16","Type":"ContainerDied","Data":"3dbfde094dbd9c7de6ec73739a78a074078bd1e928c3ad3657b7b1463bc93ca0"} Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.526464 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw2fm" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.526488 5004 scope.go:117] "RemoveContainer" containerID="5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.552826 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nd6wz" podStartSLOduration=3.687507914 podStartE2EDuration="10.552807074s" podCreationTimestamp="2025-12-03 14:52:50 +0000 UTC" firstStartedPulling="2025-12-03 14:52:52.355509428 +0000 UTC m=+2785.104479664" lastFinishedPulling="2025-12-03 14:52:59.220808578 +0000 UTC m=+2791.969778824" observedRunningTime="2025-12-03 14:53:00.541416358 +0000 UTC m=+2793.290386594" watchObservedRunningTime="2025-12-03 14:53:00.552807074 +0000 UTC m=+2793.301777310" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.564533 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.566429 5004 scope.go:117] "RemoveContainer" containerID="a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.573367 5004 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-fw2fm"] Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.586771 5004 scope.go:117] "RemoveContainer" containerID="f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.632550 5004 scope.go:117] "RemoveContainer" containerID="5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.632573 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.632606 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:00 crc kubenswrapper[5004]: E1203 14:53:00.633036 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749\": container with ID starting with 5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749 not found: ID does not exist" containerID="5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.633072 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749"} err="failed to get container status \"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749\": rpc error: code = NotFound desc = could not find container \"5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749\": container with ID starting with 5b3e1a6ad38bba390b8c3c73226f3f9d3e09ec3be0903cc1f2d448440ce43749 not found: ID does not exist" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.633098 5004 scope.go:117] "RemoveContainer" 
containerID="a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248" Dec 03 14:53:00 crc kubenswrapper[5004]: E1203 14:53:00.633431 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248\": container with ID starting with a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248 not found: ID does not exist" containerID="a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.633455 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248"} err="failed to get container status \"a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248\": rpc error: code = NotFound desc = could not find container \"a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248\": container with ID starting with a018a2af6e10e7620d9e26d763de0635cd16ee7dd05b4130a0f64698d6d9c248 not found: ID does not exist" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.633471 5004 scope.go:117] "RemoveContainer" containerID="f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2" Dec 03 14:53:00 crc kubenswrapper[5004]: E1203 14:53:00.633778 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2\": container with ID starting with f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2 not found: ID does not exist" containerID="f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2" Dec 03 14:53:00 crc kubenswrapper[5004]: I1203 14:53:00.633843 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2"} err="failed to get container status \"f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2\": rpc error: code = NotFound desc = could not find container \"f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2\": container with ID starting with f8b77b45935b88c1957300946b3de9f67b066375e108f5cee337b7458662d0a2 not found: ID does not exist" Dec 03 14:53:01 crc kubenswrapper[5004]: I1203 14:53:01.622828 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" path="/var/lib/kubelet/pods/4e002c65-d541-4fe8-ab60-9a338d693b16/volumes" Dec 03 14:53:01 crc kubenswrapper[5004]: I1203 14:53:01.697995 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nd6wz" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="registry-server" probeResult="failure" output=< Dec 03 14:53:01 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 03 14:53:01 crc kubenswrapper[5004]: > Dec 03 14:53:09 crc kubenswrapper[5004]: I1203 14:53:09.282384 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:09 crc kubenswrapper[5004]: I1203 14:53:09.284013 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:09 crc kubenswrapper[5004]: I1203 14:53:09.336229 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:09 crc kubenswrapper[5004]: I1203 14:53:09.691984 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:09 crc kubenswrapper[5004]: I1203 14:53:09.740765 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:53:10 crc kubenswrapper[5004]: I1203 14:53:10.684431 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:10 crc kubenswrapper[5004]: I1203 14:53:10.739826 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:11 crc kubenswrapper[5004]: I1203 14:53:11.654130 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s9q5g" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="registry-server" containerID="cri-o://48f8aa3229e4aa0636ca5e375b0d3f1069d8d78af44f12b8fe891934e0990f46" gracePeriod=2 Dec 03 14:53:11 crc kubenswrapper[5004]: I1203 14:53:11.973712 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.670970 5004 generic.go:334] "Generic (PLEG): container finished" podID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerID="48f8aa3229e4aa0636ca5e375b0d3f1069d8d78af44f12b8fe891934e0990f46" exitCode=0 Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.671062 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerDied","Data":"48f8aa3229e4aa0636ca5e375b0d3f1069d8d78af44f12b8fe891934e0990f46"} Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.671221 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s9q5g" event={"ID":"e18ea843-f13f-4d3b-8b7c-d5461bb50363","Type":"ContainerDied","Data":"a13e520f775cbb157ed878a59c79e766dbe1a46b5f7f43854f26291432380058"} Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.671240 5004 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="a13e520f775cbb157ed878a59c79e766dbe1a46b5f7f43854f26291432380058" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.671358 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nd6wz" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="registry-server" containerID="cri-o://ff3d22c5376c4770697b615819bdaebc73f57babbe48118030b0b4b3d2301477" gracePeriod=2 Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.691949 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.801367 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content\") pod \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.801500 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities\") pod \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.801527 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcwt9\" (UniqueName: \"kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9\") pod \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\" (UID: \"e18ea843-f13f-4d3b-8b7c-d5461bb50363\") " Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.802293 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities" (OuterVolumeSpecName: "utilities") pod 
"e18ea843-f13f-4d3b-8b7c-d5461bb50363" (UID: "e18ea843-f13f-4d3b-8b7c-d5461bb50363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.807304 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9" (OuterVolumeSpecName: "kube-api-access-wcwt9") pod "e18ea843-f13f-4d3b-8b7c-d5461bb50363" (UID: "e18ea843-f13f-4d3b-8b7c-d5461bb50363"). InnerVolumeSpecName "kube-api-access-wcwt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.903977 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.904024 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcwt9\" (UniqueName: \"kubernetes.io/projected/e18ea843-f13f-4d3b-8b7c-d5461bb50363-kube-api-access-wcwt9\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:12 crc kubenswrapper[5004]: I1203 14:53:12.912454 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e18ea843-f13f-4d3b-8b7c-d5461bb50363" (UID: "e18ea843-f13f-4d3b-8b7c-d5461bb50363"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.005307 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18ea843-f13f-4d3b-8b7c-d5461bb50363-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.684155 5004 generic.go:334] "Generic (PLEG): container finished" podID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerID="ff3d22c5376c4770697b615819bdaebc73f57babbe48118030b0b4b3d2301477" exitCode=0 Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.684381 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerDied","Data":"ff3d22c5376c4770697b615819bdaebc73f57babbe48118030b0b4b3d2301477"} Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.684466 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6wz" event={"ID":"e525d346-65e3-43d1-a4d3-d696bb0f69b5","Type":"ContainerDied","Data":"e7a123cbc22307f848ac1b582eeed6ca311ac96234ce43cc797da0676e28f7c9"} Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.684475 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s9q5g" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.684484 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7a123cbc22307f848ac1b582eeed6ca311ac96234ce43cc797da0676e28f7c9" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.737561 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.764580 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.774191 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s9q5g"] Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.821451 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content\") pod \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.821649 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q798l\" (UniqueName: \"kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l\") pod \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.821727 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities\") pod \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\" (UID: \"e525d346-65e3-43d1-a4d3-d696bb0f69b5\") " Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.822731 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities" (OuterVolumeSpecName: "utilities") pod "e525d346-65e3-43d1-a4d3-d696bb0f69b5" (UID: "e525d346-65e3-43d1-a4d3-d696bb0f69b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.825880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l" (OuterVolumeSpecName: "kube-api-access-q798l") pod "e525d346-65e3-43d1-a4d3-d696bb0f69b5" (UID: "e525d346-65e3-43d1-a4d3-d696bb0f69b5"). InnerVolumeSpecName "kube-api-access-q798l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.878742 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e525d346-65e3-43d1-a4d3-d696bb0f69b5" (UID: "e525d346-65e3-43d1-a4d3-d696bb0f69b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.923737 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.923795 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q798l\" (UniqueName: \"kubernetes.io/projected/e525d346-65e3-43d1-a4d3-d696bb0f69b5-kube-api-access-q798l\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:13 crc kubenswrapper[5004]: I1203 14:53:13.923810 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e525d346-65e3-43d1-a4d3-d696bb0f69b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 14:53:14 crc kubenswrapper[5004]: I1203 14:53:14.693790 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6wz" Dec 03 14:53:14 crc kubenswrapper[5004]: I1203 14:53:14.730997 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:53:14 crc kubenswrapper[5004]: I1203 14:53:14.739405 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nd6wz"] Dec 03 14:53:15 crc kubenswrapper[5004]: I1203 14:53:15.625442 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" path="/var/lib/kubelet/pods/e18ea843-f13f-4d3b-8b7c-d5461bb50363/volumes" Dec 03 14:53:15 crc kubenswrapper[5004]: I1203 14:53:15.626418 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" path="/var/lib/kubelet/pods/e525d346-65e3-43d1-a4d3-d696bb0f69b5/volumes" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.731736 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.733817 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.733935 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734036 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734104 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734164 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734224 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734292 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734347 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734409 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734470 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734530 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734586 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734647 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734702 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734762 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734819 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="extract-content" Dec 03 14:53:38 crc kubenswrapper[5004]: E1203 14:53:38.734903 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.734979 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="extract-utilities" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.735226 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18ea843-f13f-4d3b-8b7c-d5461bb50363" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.735308 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e002c65-d541-4fe8-ab60-9a338d693b16" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.735369 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e525d346-65e3-43d1-a4d3-d696bb0f69b5" containerName="registry-server" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.736178 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.738523 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.738676 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6dldj" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.738907 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.744010 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.769433 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812017 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812115 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812137 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812271 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812328 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812362 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbr2\" (UniqueName: 
\"kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.812391 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914541 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914631 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914677 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914736 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914797 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914841 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914922 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbr2\" (UniqueName: \"kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.914974 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.915230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " 
pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.915666 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.916599 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.916921 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.917091 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.917225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc 
kubenswrapper[5004]: I1203 14:53:38.923902 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.927925 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.930325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.934480 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbr2\" (UniqueName: \"kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:38 crc kubenswrapper[5004]: I1203 14:53:38.966024 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " pod="openstack/tempest-tests-tempest" Dec 03 14:53:39 crc kubenswrapper[5004]: I1203 14:53:39.068422 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 14:53:39 crc kubenswrapper[5004]: I1203 14:53:39.561985 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 14:53:39 crc kubenswrapper[5004]: I1203 14:53:39.935083 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"be771b30-f62b-4d18-977a-2c0d6ecca56a","Type":"ContainerStarted","Data":"870764504e96c0641b9001f32c7df3c0210528d04c1bbed0a47ee8a299855cfe"} Dec 03 14:53:43 crc kubenswrapper[5004]: I1203 14:53:43.729078 5004 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pode18ea843-f13f-4d3b-8b7c-d5461bb50363"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pode18ea843-f13f-4d3b-8b7c-d5461bb50363] : Timed out while waiting for systemd to remove kubepods-burstable-pode18ea843_f13f_4d3b_8b7c_d5461bb50363.slice" Dec 03 14:53:52 crc kubenswrapper[5004]: I1203 14:53:52.824795 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:53:52 crc kubenswrapper[5004]: I1203 14:53:52.825496 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:54:08 crc kubenswrapper[5004]: E1203 14:54:08.756537 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 03 14:54:08 
crc kubenswrapper[5004]: E1203 14:54:08.760198 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrbr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serv
iceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(be771b30-f62b-4d18-977a-2c0d6ecca56a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 14:54:08 crc kubenswrapper[5004]: E1203 14:54:08.761489 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="be771b30-f62b-4d18-977a-2c0d6ecca56a" Dec 03 14:54:09 crc kubenswrapper[5004]: E1203 14:54:09.239424 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" 
pod="openstack/tempest-tests-tempest" podUID="be771b30-f62b-4d18-977a-2c0d6ecca56a" Dec 03 14:54:22 crc kubenswrapper[5004]: I1203 14:54:22.824505 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:54:22 crc kubenswrapper[5004]: I1203 14:54:22.825040 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:54:26 crc kubenswrapper[5004]: I1203 14:54:26.434273 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"be771b30-f62b-4d18-977a-2c0d6ecca56a","Type":"ContainerStarted","Data":"d3f36bce23b86e1409d8eb029f83014c9e12251b563d8811c05c75e9253d2610"} Dec 03 14:54:26 crc kubenswrapper[5004]: I1203 14:54:26.465157 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.992889319 podStartE2EDuration="49.465139121s" podCreationTimestamp="2025-12-03 14:53:37 +0000 UTC" firstStartedPulling="2025-12-03 14:53:39.567522556 +0000 UTC m=+2832.316492792" lastFinishedPulling="2025-12-03 14:54:25.039772358 +0000 UTC m=+2877.788742594" observedRunningTime="2025-12-03 14:54:26.455059882 +0000 UTC m=+2879.204030128" watchObservedRunningTime="2025-12-03 14:54:26.465139121 +0000 UTC m=+2879.214109357" Dec 03 14:54:52 crc kubenswrapper[5004]: I1203 14:54:52.824465 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:54:52 crc kubenswrapper[5004]: I1203 14:54:52.825017 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:54:52 crc kubenswrapper[5004]: I1203 14:54:52.825060 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:54:52 crc kubenswrapper[5004]: I1203 14:54:52.825732 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:54:52 crc kubenswrapper[5004]: I1203 14:54:52.825912 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472" gracePeriod=600 Dec 03 14:54:53 crc kubenswrapper[5004]: I1203 14:54:53.659846 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472" exitCode=0 Dec 03 14:54:53 crc kubenswrapper[5004]: I1203 14:54:53.659895 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472"} Dec 03 14:54:53 crc kubenswrapper[5004]: I1203 14:54:53.660530 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419"} Dec 03 14:54:53 crc kubenswrapper[5004]: I1203 14:54:53.660556 5004 scope.go:117] "RemoveContainer" containerID="2c3cdacd433151be2b526cc1c0350c8100d9370ae1dd79abb4683217f36c55e8" Dec 03 14:57:22 crc kubenswrapper[5004]: I1203 14:57:22.824772 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:57:22 crc kubenswrapper[5004]: I1203 14:57:22.825544 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:57:52 crc kubenswrapper[5004]: I1203 14:57:52.824918 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:57:52 crc kubenswrapper[5004]: I1203 14:57:52.825511 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.824478 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.824941 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.824985 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.825692 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.825745 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" gracePeriod=600 Dec 03 14:58:22 crc kubenswrapper[5004]: E1203 
14:58:22.964818 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.986711 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" exitCode=0 Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.986762 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419"} Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.987122 5004 scope.go:117] "RemoveContainer" containerID="ae765e4d5ab6f65ae5a19112bda334ecc704233e5b290d8b14f50184e937e472" Dec 03 14:58:22 crc kubenswrapper[5004]: I1203 14:58:22.987884 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:58:22 crc kubenswrapper[5004]: E1203 14:58:22.988271 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:58:38 crc kubenswrapper[5004]: I1203 14:58:38.612711 5004 scope.go:117] "RemoveContainer" 
containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:58:38 crc kubenswrapper[5004]: E1203 14:58:38.613584 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:58:50 crc kubenswrapper[5004]: I1203 14:58:50.613162 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:58:50 crc kubenswrapper[5004]: E1203 14:58:50.614328 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:59:04 crc kubenswrapper[5004]: I1203 14:59:04.614270 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:59:04 crc kubenswrapper[5004]: E1203 14:59:04.617079 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:59:19 crc kubenswrapper[5004]: I1203 14:59:19.613107 5004 scope.go:117] 
"RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:59:19 crc kubenswrapper[5004]: E1203 14:59:19.614118 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:59:34 crc kubenswrapper[5004]: I1203 14:59:34.613749 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:59:34 crc kubenswrapper[5004]: E1203 14:59:34.614690 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:59:37 crc kubenswrapper[5004]: I1203 14:59:37.966723 5004 scope.go:117] "RemoveContainer" containerID="3fe154a383df502bfddc133cb3c554c6cdf9565a14dcb80ff1e8e61354d38f12" Dec 03 14:59:37 crc kubenswrapper[5004]: I1203 14:59:37.987662 5004 scope.go:117] "RemoveContainer" containerID="48f8aa3229e4aa0636ca5e375b0d3f1069d8d78af44f12b8fe891934e0990f46" Dec 03 14:59:38 crc kubenswrapper[5004]: I1203 14:59:38.030389 5004 scope.go:117] "RemoveContainer" containerID="122bc9606c0ada5ab7a51023d9819de7bf632c05016f7bd5e912316dcdc5a7b5" Dec 03 14:59:38 crc kubenswrapper[5004]: I1203 14:59:38.053406 5004 scope.go:117] "RemoveContainer" containerID="ff3d22c5376c4770697b615819bdaebc73f57babbe48118030b0b4b3d2301477" Dec 03 14:59:38 crc 
kubenswrapper[5004]: I1203 14:59:38.101256 5004 scope.go:117] "RemoveContainer" containerID="49d37ff82e112f43b8fd31d5930216cfa495a6386ac858827ced3aec79e3a4ab" Dec 03 14:59:38 crc kubenswrapper[5004]: I1203 14:59:38.124811 5004 scope.go:117] "RemoveContainer" containerID="bc04099d7e854c74c2bc8a60e00d7096041b662f9705209fe0450a3a032f3652" Dec 03 14:59:47 crc kubenswrapper[5004]: I1203 14:59:47.618442 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:59:47 crc kubenswrapper[5004]: E1203 14:59:47.619304 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 14:59:59 crc kubenswrapper[5004]: I1203 14:59:59.612849 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 14:59:59 crc kubenswrapper[5004]: E1203 14:59:59.613667 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.158972 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq"] Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.160505 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.164554 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.164621 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.167689 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq"] Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.228708 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsn28\" (UniqueName: \"kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.228798 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.228968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.331036 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.331431 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsn28\" (UniqueName: \"kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.331934 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.332349 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.338509 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.349384 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsn28\" (UniqueName: \"kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28\") pod \"collect-profiles-29412900-tbwhq\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.495685 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:00 crc kubenswrapper[5004]: I1203 15:00:00.958987 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq"] Dec 03 15:00:01 crc kubenswrapper[5004]: I1203 15:00:01.977806 5004 generic.go:334] "Generic (PLEG): container finished" podID="5050a816-4329-47b1-802f-d1f71e373d23" containerID="4c4b237e88f9f6c1dfc01d923b9fe20d79c4e72f6e39eb1b0029021c437f6ff5" exitCode=0 Dec 03 15:00:01 crc kubenswrapper[5004]: I1203 15:00:01.977939 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" event={"ID":"5050a816-4329-47b1-802f-d1f71e373d23","Type":"ContainerDied","Data":"4c4b237e88f9f6c1dfc01d923b9fe20d79c4e72f6e39eb1b0029021c437f6ff5"} Dec 03 15:00:01 crc kubenswrapper[5004]: I1203 15:00:01.978186 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" 
event={"ID":"5050a816-4329-47b1-802f-d1f71e373d23","Type":"ContainerStarted","Data":"23a13eb83ff4b6dad81bbb8a2414cd8cd6ca20432a67d1d5aa9e5541f79827cb"} Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.360674 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.399163 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume\") pod \"5050a816-4329-47b1-802f-d1f71e373d23\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.399256 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume\") pod \"5050a816-4329-47b1-802f-d1f71e373d23\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.399366 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsn28\" (UniqueName: \"kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28\") pod \"5050a816-4329-47b1-802f-d1f71e373d23\" (UID: \"5050a816-4329-47b1-802f-d1f71e373d23\") " Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.400306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume" (OuterVolumeSpecName: "config-volume") pod "5050a816-4329-47b1-802f-d1f71e373d23" (UID: "5050a816-4329-47b1-802f-d1f71e373d23"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.405495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5050a816-4329-47b1-802f-d1f71e373d23" (UID: "5050a816-4329-47b1-802f-d1f71e373d23"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.406547 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28" (OuterVolumeSpecName: "kube-api-access-qsn28") pod "5050a816-4329-47b1-802f-d1f71e373d23" (UID: "5050a816-4329-47b1-802f-d1f71e373d23"). InnerVolumeSpecName "kube-api-access-qsn28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.500943 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5050a816-4329-47b1-802f-d1f71e373d23-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.500982 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsn28\" (UniqueName: \"kubernetes.io/projected/5050a816-4329-47b1-802f-d1f71e373d23-kube-api-access-qsn28\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.500995 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5050a816-4329-47b1-802f-d1f71e373d23-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.997057 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" 
event={"ID":"5050a816-4329-47b1-802f-d1f71e373d23","Type":"ContainerDied","Data":"23a13eb83ff4b6dad81bbb8a2414cd8cd6ca20432a67d1d5aa9e5541f79827cb"} Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.997105 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a13eb83ff4b6dad81bbb8a2414cd8cd6ca20432a67d1d5aa9e5541f79827cb" Dec 03 15:00:03 crc kubenswrapper[5004]: I1203 15:00:03.997105 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412900-tbwhq" Dec 03 15:00:04 crc kubenswrapper[5004]: I1203 15:00:04.459385 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4"] Dec 03 15:00:04 crc kubenswrapper[5004]: I1203 15:00:04.467866 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412855-gc7c4"] Dec 03 15:00:05 crc kubenswrapper[5004]: I1203 15:00:05.623447 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadc310e-caab-475e-9900-c376fd4f5371" path="/var/lib/kubelet/pods/aadc310e-caab-475e-9900-c376fd4f5371/volumes" Dec 03 15:00:12 crc kubenswrapper[5004]: I1203 15:00:12.613364 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:00:12 crc kubenswrapper[5004]: E1203 15:00:12.614197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:00:26 crc kubenswrapper[5004]: I1203 15:00:26.613442 5004 scope.go:117] "RemoveContainer" 
containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:00:26 crc kubenswrapper[5004]: E1203 15:00:26.614347 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:00:38 crc kubenswrapper[5004]: I1203 15:00:38.231009 5004 scope.go:117] "RemoveContainer" containerID="a207fe752e9aa222fa7939c9c7e70388018ea970ee762f607fc5dd498b0ddbb8" Dec 03 15:00:41 crc kubenswrapper[5004]: I1203 15:00:41.613022 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:00:41 crc kubenswrapper[5004]: E1203 15:00:41.613919 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:00:55 crc kubenswrapper[5004]: I1203 15:00:55.613189 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:00:55 crc kubenswrapper[5004]: E1203 15:00:55.614435 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.164807 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412901-5c859"] Dec 03 15:01:00 crc kubenswrapper[5004]: E1203 15:01:00.165797 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5050a816-4329-47b1-802f-d1f71e373d23" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.165810 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5050a816-4329-47b1-802f-d1f71e373d23" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.166044 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5050a816-4329-47b1-802f-d1f71e373d23" containerName="collect-profiles" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.166697 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.176709 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412901-5c859"] Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.234557 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.234643 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " 
pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.234757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2lt\" (UniqueName: \"kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.234813 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.336279 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.336434 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2lt\" (UniqueName: \"kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.336509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " 
pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.336545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.342584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.342638 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.346019 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.359240 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2lt\" (UniqueName: \"kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt\") pod \"keystone-cron-29412901-5c859\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 
15:01:00.505723 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:00 crc kubenswrapper[5004]: I1203 15:01:00.956693 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412901-5c859"] Dec 03 15:01:01 crc kubenswrapper[5004]: I1203 15:01:01.532609 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-5c859" event={"ID":"ad605cec-0786-4e4e-a1f2-9626a34e39c8","Type":"ContainerStarted","Data":"7e3c2e4960883e0ac3ae48c313d7e62558cf15a83d38370b5592afbaeb6d50a2"} Dec 03 15:01:01 crc kubenswrapper[5004]: I1203 15:01:01.532961 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-5c859" event={"ID":"ad605cec-0786-4e4e-a1f2-9626a34e39c8","Type":"ContainerStarted","Data":"bc7fcf9329b3fc2652eab49b3adffdd202c36767871b5531c1a29d862be97666"} Dec 03 15:01:01 crc kubenswrapper[5004]: I1203 15:01:01.563204 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412901-5c859" podStartSLOduration=1.5631880489999999 podStartE2EDuration="1.563188049s" podCreationTimestamp="2025-12-03 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:01:01.557793404 +0000 UTC m=+3274.306763650" watchObservedRunningTime="2025-12-03 15:01:01.563188049 +0000 UTC m=+3274.312158285" Dec 03 15:01:03 crc kubenswrapper[5004]: I1203 15:01:03.554044 5004 generic.go:334] "Generic (PLEG): container finished" podID="ad605cec-0786-4e4e-a1f2-9626a34e39c8" containerID="7e3c2e4960883e0ac3ae48c313d7e62558cf15a83d38370b5592afbaeb6d50a2" exitCode=0 Dec 03 15:01:03 crc kubenswrapper[5004]: I1203 15:01:03.554104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-5c859" 
event={"ID":"ad605cec-0786-4e4e-a1f2-9626a34e39c8","Type":"ContainerDied","Data":"7e3c2e4960883e0ac3ae48c313d7e62558cf15a83d38370b5592afbaeb6d50a2"} Dec 03 15:01:04 crc kubenswrapper[5004]: I1203 15:01:04.933289 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.050704 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle\") pod \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.050898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys\") pod \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.051066 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data\") pod \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.051229 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d2lt\" (UniqueName: \"kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt\") pod \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\" (UID: \"ad605cec-0786-4e4e-a1f2-9626a34e39c8\") " Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.058196 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "ad605cec-0786-4e4e-a1f2-9626a34e39c8" (UID: "ad605cec-0786-4e4e-a1f2-9626a34e39c8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.060231 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt" (OuterVolumeSpecName: "kube-api-access-4d2lt") pod "ad605cec-0786-4e4e-a1f2-9626a34e39c8" (UID: "ad605cec-0786-4e4e-a1f2-9626a34e39c8"). InnerVolumeSpecName "kube-api-access-4d2lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.104077 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data" (OuterVolumeSpecName: "config-data") pod "ad605cec-0786-4e4e-a1f2-9626a34e39c8" (UID: "ad605cec-0786-4e4e-a1f2-9626a34e39c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.107529 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad605cec-0786-4e4e-a1f2-9626a34e39c8" (UID: "ad605cec-0786-4e4e-a1f2-9626a34e39c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.154245 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d2lt\" (UniqueName: \"kubernetes.io/projected/ad605cec-0786-4e4e-a1f2-9626a34e39c8-kube-api-access-4d2lt\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.154304 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.154324 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.154342 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad605cec-0786-4e4e-a1f2-9626a34e39c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.579918 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412901-5c859" event={"ID":"ad605cec-0786-4e4e-a1f2-9626a34e39c8","Type":"ContainerDied","Data":"bc7fcf9329b3fc2652eab49b3adffdd202c36767871b5531c1a29d862be97666"} Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.579959 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7fcf9329b3fc2652eab49b3adffdd202c36767871b5531c1a29d862be97666" Dec 03 15:01:05 crc kubenswrapper[5004]: I1203 15:01:05.579989 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412901-5c859" Dec 03 15:01:08 crc kubenswrapper[5004]: I1203 15:01:08.614207 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:01:08 crc kubenswrapper[5004]: E1203 15:01:08.614838 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:01:23 crc kubenswrapper[5004]: I1203 15:01:23.613338 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:01:23 crc kubenswrapper[5004]: E1203 15:01:23.614115 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.916029 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:25 crc kubenswrapper[5004]: E1203 15:01:25.916551 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad605cec-0786-4e4e-a1f2-9626a34e39c8" containerName="keystone-cron" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.916569 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad605cec-0786-4e4e-a1f2-9626a34e39c8" containerName="keystone-cron" Dec 03 15:01:25 crc 
kubenswrapper[5004]: I1203 15:01:25.916812 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad605cec-0786-4e4e-a1f2-9626a34e39c8" containerName="keystone-cron" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.918617 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.930565 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.958259 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2dv\" (UniqueName: \"kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.958318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:25 crc kubenswrapper[5004]: I1203 15:01:25.958375 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.060809 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2dv\" (UniqueName: 
\"kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.060925 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.061014 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.061529 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.061612 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.084797 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2dv\" (UniqueName: 
\"kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv\") pod \"community-operators-dwrdv\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.253367 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:26 crc kubenswrapper[5004]: I1203 15:01:26.856183 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:27 crc kubenswrapper[5004]: I1203 15:01:27.806005 5004 generic.go:334] "Generic (PLEG): container finished" podID="82794a2c-f6c4-46f8-8483-788442a59596" containerID="2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6" exitCode=0 Dec 03 15:01:27 crc kubenswrapper[5004]: I1203 15:01:27.806071 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerDied","Data":"2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6"} Dec 03 15:01:27 crc kubenswrapper[5004]: I1203 15:01:27.806714 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerStarted","Data":"6f5d9de43a767f9cea9214e8020d6fc38b4c12d3dec9f555cd8f3c286a47eacf"} Dec 03 15:01:27 crc kubenswrapper[5004]: I1203 15:01:27.810277 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:01:28 crc kubenswrapper[5004]: I1203 15:01:28.818555 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerStarted","Data":"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e"} Dec 03 15:01:29 
crc kubenswrapper[5004]: I1203 15:01:29.833980 5004 generic.go:334] "Generic (PLEG): container finished" podID="82794a2c-f6c4-46f8-8483-788442a59596" containerID="37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e" exitCode=0 Dec 03 15:01:29 crc kubenswrapper[5004]: I1203 15:01:29.834052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerDied","Data":"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e"} Dec 03 15:01:30 crc kubenswrapper[5004]: I1203 15:01:30.847945 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerStarted","Data":"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db"} Dec 03 15:01:30 crc kubenswrapper[5004]: I1203 15:01:30.869340 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dwrdv" podStartSLOduration=3.432149003 podStartE2EDuration="5.869318688s" podCreationTimestamp="2025-12-03 15:01:25 +0000 UTC" firstStartedPulling="2025-12-03 15:01:27.81008666 +0000 UTC m=+3300.559056896" lastFinishedPulling="2025-12-03 15:01:30.247256315 +0000 UTC m=+3302.996226581" observedRunningTime="2025-12-03 15:01:30.861652048 +0000 UTC m=+3303.610622294" watchObservedRunningTime="2025-12-03 15:01:30.869318688 +0000 UTC m=+3303.618288944" Dec 03 15:01:36 crc kubenswrapper[5004]: I1203 15:01:36.254373 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:36 crc kubenswrapper[5004]: I1203 15:01:36.255089 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:36 crc kubenswrapper[5004]: I1203 15:01:36.304674 5004 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:36 crc kubenswrapper[5004]: I1203 15:01:36.613818 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:01:36 crc kubenswrapper[5004]: E1203 15:01:36.614087 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:01:36 crc kubenswrapper[5004]: I1203 15:01:36.955778 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:37 crc kubenswrapper[5004]: I1203 15:01:37.005778 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:38 crc kubenswrapper[5004]: I1203 15:01:38.924950 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dwrdv" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="registry-server" containerID="cri-o://7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db" gracePeriod=2 Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.519791 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.639319 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities\") pod \"82794a2c-f6c4-46f8-8483-788442a59596\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.639436 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content\") pod \"82794a2c-f6c4-46f8-8483-788442a59596\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.639557 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk2dv\" (UniqueName: \"kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv\") pod \"82794a2c-f6c4-46f8-8483-788442a59596\" (UID: \"82794a2c-f6c4-46f8-8483-788442a59596\") " Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.640440 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities" (OuterVolumeSpecName: "utilities") pod "82794a2c-f6c4-46f8-8483-788442a59596" (UID: "82794a2c-f6c4-46f8-8483-788442a59596"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.648086 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv" (OuterVolumeSpecName: "kube-api-access-qk2dv") pod "82794a2c-f6c4-46f8-8483-788442a59596" (UID: "82794a2c-f6c4-46f8-8483-788442a59596"). InnerVolumeSpecName "kube-api-access-qk2dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.704213 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82794a2c-f6c4-46f8-8483-788442a59596" (UID: "82794a2c-f6c4-46f8-8483-788442a59596"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.741438 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.741499 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82794a2c-f6c4-46f8-8483-788442a59596-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.741516 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk2dv\" (UniqueName: \"kubernetes.io/projected/82794a2c-f6c4-46f8-8483-788442a59596-kube-api-access-qk2dv\") on node \"crc\" DevicePath \"\"" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.937898 5004 generic.go:334] "Generic (PLEG): container finished" podID="82794a2c-f6c4-46f8-8483-788442a59596" containerID="7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db" exitCode=0 Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.937945 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerDied","Data":"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db"} Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.937976 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-dwrdv" event={"ID":"82794a2c-f6c4-46f8-8483-788442a59596","Type":"ContainerDied","Data":"6f5d9de43a767f9cea9214e8020d6fc38b4c12d3dec9f555cd8f3c286a47eacf"} Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.937995 5004 scope.go:117] "RemoveContainer" containerID="7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.938021 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwrdv" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.975319 5004 scope.go:117] "RemoveContainer" containerID="37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e" Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.979534 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:39 crc kubenswrapper[5004]: I1203 15:01:39.991924 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dwrdv"] Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.010375 5004 scope.go:117] "RemoveContainer" containerID="2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.056351 5004 scope.go:117] "RemoveContainer" containerID="7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db" Dec 03 15:01:40 crc kubenswrapper[5004]: E1203 15:01:40.056836 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db\": container with ID starting with 7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db not found: ID does not exist" containerID="7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 
15:01:40.056897 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db"} err="failed to get container status \"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db\": rpc error: code = NotFound desc = could not find container \"7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db\": container with ID starting with 7b9535860d0cb39340e0ef568da1ca6fe2d6158f9171df2eda4046ec75f263db not found: ID does not exist" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.056928 5004 scope.go:117] "RemoveContainer" containerID="37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e" Dec 03 15:01:40 crc kubenswrapper[5004]: E1203 15:01:40.057544 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e\": container with ID starting with 37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e not found: ID does not exist" containerID="37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.057581 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e"} err="failed to get container status \"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e\": rpc error: code = NotFound desc = could not find container \"37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e\": container with ID starting with 37bcd53098d16dd4b1ebcae01fc296dc2c592457a7310025a92dbef87103547e not found: ID does not exist" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.057609 5004 scope.go:117] "RemoveContainer" containerID="2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6" Dec 03 15:01:40 crc 
kubenswrapper[5004]: E1203 15:01:40.058027 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6\": container with ID starting with 2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6 not found: ID does not exist" containerID="2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6" Dec 03 15:01:40 crc kubenswrapper[5004]: I1203 15:01:40.058055 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6"} err="failed to get container status \"2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6\": rpc error: code = NotFound desc = could not find container \"2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6\": container with ID starting with 2fad17b3c7f235ad04b4e298d1deed3bdfbf3f8c2faa85769d1c5fc4c95854c6 not found: ID does not exist" Dec 03 15:01:41 crc kubenswrapper[5004]: I1203 15:01:41.622478 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82794a2c-f6c4-46f8-8483-788442a59596" path="/var/lib/kubelet/pods/82794a2c-f6c4-46f8-8483-788442a59596/volumes" Dec 03 15:01:49 crc kubenswrapper[5004]: I1203 15:01:49.613501 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:01:49 crc kubenswrapper[5004]: E1203 15:01:49.614511 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:02:04 crc 
kubenswrapper[5004]: I1203 15:02:04.613416 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:02:04 crc kubenswrapper[5004]: E1203 15:02:04.614144 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:02:19 crc kubenswrapper[5004]: I1203 15:02:19.614927 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:02:19 crc kubenswrapper[5004]: E1203 15:02:19.615833 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:02:33 crc kubenswrapper[5004]: I1203 15:02:33.613294 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:02:33 crc kubenswrapper[5004]: E1203 15:02:33.614168 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 
03 15:02:47 crc kubenswrapper[5004]: I1203 15:02:47.619360 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:02:47 crc kubenswrapper[5004]: E1203 15:02:47.620910 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:03:02 crc kubenswrapper[5004]: I1203 15:03:02.612589 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:03:02 crc kubenswrapper[5004]: E1203 15:03:02.613443 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:03:14 crc kubenswrapper[5004]: I1203 15:03:14.612796 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:03:14 crc kubenswrapper[5004]: E1203 15:03:14.613637 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:03:25 crc kubenswrapper[5004]: I1203 15:03:25.613567 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:03:26 crc kubenswrapper[5004]: I1203 15:03:26.044924 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06"} Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.038421 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:30 crc kubenswrapper[5004]: E1203 15:03:30.039502 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="extract-content" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.039529 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="extract-content" Dec 03 15:03:30 crc kubenswrapper[5004]: E1203 15:03:30.039563 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="extract-utilities" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.039576 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="extract-utilities" Dec 03 15:03:30 crc kubenswrapper[5004]: E1203 15:03:30.039616 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="registry-server" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.039624 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="registry-server" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.039941 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="82794a2c-f6c4-46f8-8483-788442a59596" containerName="registry-server" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.041725 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.050557 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.126176 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.126233 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.126304 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftczh\" (UniqueName: \"kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.228323 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities\") pod 
\"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.228661 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftczh\" (UniqueName: \"kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.228875 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.228873 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.229083 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content\") pod \"redhat-operators-2dzp4\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.250903 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftczh\" (UniqueName: \"kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh\") pod \"redhat-operators-2dzp4\" (UID: 
\"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.372448 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:30 crc kubenswrapper[5004]: I1203 15:03:30.850289 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:31 crc kubenswrapper[5004]: I1203 15:03:31.091470 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerStarted","Data":"d9b7f73e3e442d98f013a7e527c021ffecf11c2c85c0529f0ded882ba642e207"} Dec 03 15:03:32 crc kubenswrapper[5004]: I1203 15:03:32.102292 5004 generic.go:334] "Generic (PLEG): container finished" podID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerID="e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a" exitCode=0 Dec 03 15:03:32 crc kubenswrapper[5004]: I1203 15:03:32.102478 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerDied","Data":"e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a"} Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.132516 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerStarted","Data":"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc"} Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.614789 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.617449 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.630940 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.710572 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.710772 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.711002 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbrkm\" (UniqueName: \"kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.812652 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbrkm\" (UniqueName: \"kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.812787 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.812883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.813407 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.813614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.845143 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbrkm\" (UniqueName: \"kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm\") pod \"certified-operators-xl5b6\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:34 crc kubenswrapper[5004]: I1203 15:03:34.945613 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:35 crc kubenswrapper[5004]: I1203 15:03:35.648707 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:03:36 crc kubenswrapper[5004]: I1203 15:03:36.154765 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerStarted","Data":"e0800935e68d0d90de7d8e60282d942345f051fa9f935bc092262de7509cab25"} Dec 03 15:03:37 crc kubenswrapper[5004]: I1203 15:03:37.166683 5004 generic.go:334] "Generic (PLEG): container finished" podID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerID="c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc" exitCode=0 Dec 03 15:03:37 crc kubenswrapper[5004]: I1203 15:03:37.166770 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerDied","Data":"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc"} Dec 03 15:03:40 crc kubenswrapper[5004]: I1203 15:03:40.195376 5004 generic.go:334] "Generic (PLEG): container finished" podID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerID="da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19" exitCode=0 Dec 03 15:03:40 crc kubenswrapper[5004]: I1203 15:03:40.195443 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerDied","Data":"da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19"} Dec 03 15:03:41 crc kubenswrapper[5004]: I1203 15:03:41.204705 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" 
event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerStarted","Data":"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c"} Dec 03 15:03:41 crc kubenswrapper[5004]: I1203 15:03:41.207259 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerStarted","Data":"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad"} Dec 03 15:03:41 crc kubenswrapper[5004]: I1203 15:03:41.254113 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2dzp4" podStartSLOduration=3.328635467 podStartE2EDuration="11.254089063s" podCreationTimestamp="2025-12-03 15:03:30 +0000 UTC" firstStartedPulling="2025-12-03 15:03:32.105438063 +0000 UTC m=+3424.854408299" lastFinishedPulling="2025-12-03 15:03:40.030891659 +0000 UTC m=+3432.779861895" observedRunningTime="2025-12-03 15:03:41.244811377 +0000 UTC m=+3433.993781613" watchObservedRunningTime="2025-12-03 15:03:41.254089063 +0000 UTC m=+3434.003059299" Dec 03 15:03:43 crc kubenswrapper[5004]: I1203 15:03:43.231556 5004 generic.go:334] "Generic (PLEG): container finished" podID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerID="24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad" exitCode=0 Dec 03 15:03:43 crc kubenswrapper[5004]: I1203 15:03:43.231671 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerDied","Data":"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad"} Dec 03 15:03:50 crc kubenswrapper[5004]: I1203 15:03:50.373823 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:50 crc kubenswrapper[5004]: I1203 15:03:50.374537 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:50 crc kubenswrapper[5004]: I1203 15:03:50.423943 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:51 crc kubenswrapper[5004]: I1203 15:03:51.368257 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:51 crc kubenswrapper[5004]: I1203 15:03:51.418334 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.325101 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2dzp4" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="registry-server" containerID="cri-o://9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c" gracePeriod=2 Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.851633 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.912482 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content\") pod \"9a6cfb23-d879-440d-8b64-13e9230f31bc\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.912572 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftczh\" (UniqueName: \"kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh\") pod \"9a6cfb23-d879-440d-8b64-13e9230f31bc\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.912722 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities\") pod \"9a6cfb23-d879-440d-8b64-13e9230f31bc\" (UID: \"9a6cfb23-d879-440d-8b64-13e9230f31bc\") " Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.913476 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities" (OuterVolumeSpecName: "utilities") pod "9a6cfb23-d879-440d-8b64-13e9230f31bc" (UID: "9a6cfb23-d879-440d-8b64-13e9230f31bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:03:53 crc kubenswrapper[5004]: I1203 15:03:53.920647 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh" (OuterVolumeSpecName: "kube-api-access-ftczh") pod "9a6cfb23-d879-440d-8b64-13e9230f31bc" (UID: "9a6cfb23-d879-440d-8b64-13e9230f31bc"). InnerVolumeSpecName "kube-api-access-ftczh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.015140 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftczh\" (UniqueName: \"kubernetes.io/projected/9a6cfb23-d879-440d-8b64-13e9230f31bc-kube-api-access-ftczh\") on node \"crc\" DevicePath \"\"" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.015175 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.027404 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a6cfb23-d879-440d-8b64-13e9230f31bc" (UID: "9a6cfb23-d879-440d-8b64-13e9230f31bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.116819 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6cfb23-d879-440d-8b64-13e9230f31bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.335828 5004 generic.go:334] "Generic (PLEG): container finished" podID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerID="9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c" exitCode=0 Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.335859 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerDied","Data":"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c"} Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.335979 5004 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2dzp4" event={"ID":"9a6cfb23-d879-440d-8b64-13e9230f31bc","Type":"ContainerDied","Data":"d9b7f73e3e442d98f013a7e527c021ffecf11c2c85c0529f0ded882ba642e207"} Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.335997 5004 scope.go:117] "RemoveContainer" containerID="9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.336118 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzp4" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.338501 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerStarted","Data":"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d"} Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.367617 5004 scope.go:117] "RemoveContainer" containerID="c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.375805 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xl5b6" podStartSLOduration=6.895871495 podStartE2EDuration="20.375788048s" podCreationTimestamp="2025-12-03 15:03:34 +0000 UTC" firstStartedPulling="2025-12-03 15:03:40.196794205 +0000 UTC m=+3432.945764441" lastFinishedPulling="2025-12-03 15:03:53.676710758 +0000 UTC m=+3446.425680994" observedRunningTime="2025-12-03 15:03:54.361260032 +0000 UTC m=+3447.110230268" watchObservedRunningTime="2025-12-03 15:03:54.375788048 +0000 UTC m=+3447.124758284" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.390531 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.397957 5004 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-2dzp4"] Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.425516 5004 scope.go:117] "RemoveContainer" containerID="e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.480354 5004 scope.go:117] "RemoveContainer" containerID="9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c" Dec 03 15:03:54 crc kubenswrapper[5004]: E1203 15:03:54.481097 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c\": container with ID starting with 9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c not found: ID does not exist" containerID="9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.481145 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c"} err="failed to get container status \"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c\": rpc error: code = NotFound desc = could not find container \"9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c\": container with ID starting with 9b95622d5edb73e545831c29cdd0bff269f911fc244cddae76019fc852f7487c not found: ID does not exist" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.481174 5004 scope.go:117] "RemoveContainer" containerID="c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc" Dec 03 15:03:54 crc kubenswrapper[5004]: E1203 15:03:54.482017 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc\": container with ID starting with 
c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc not found: ID does not exist" containerID="c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.482050 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc"} err="failed to get container status \"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc\": rpc error: code = NotFound desc = could not find container \"c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc\": container with ID starting with c106955251276cbf118ac0ae9f4339c5ec84b34bd3ecdfddfef86ecc98fb36dc not found: ID does not exist" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.482072 5004 scope.go:117] "RemoveContainer" containerID="e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a" Dec 03 15:03:54 crc kubenswrapper[5004]: E1203 15:03:54.482437 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a\": container with ID starting with e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a not found: ID does not exist" containerID="e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.482536 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a"} err="failed to get container status \"e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a\": rpc error: code = NotFound desc = could not find container \"e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a\": container with ID starting with e1d08e1deda8232e943b6fc7544d55c44915ea6db75969074555c732e2c9957a not found: ID does not 
exist" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.946639 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:54 crc kubenswrapper[5004]: I1203 15:03:54.946692 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:03:55 crc kubenswrapper[5004]: I1203 15:03:55.623356 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" path="/var/lib/kubelet/pods/9a6cfb23-d879-440d-8b64-13e9230f31bc/volumes" Dec 03 15:03:55 crc kubenswrapper[5004]: I1203 15:03:55.999364 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xl5b6" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="registry-server" probeResult="failure" output=< Dec 03 15:03:55 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 03 15:03:55 crc kubenswrapper[5004]: > Dec 03 15:04:05 crc kubenswrapper[5004]: I1203 15:04:05.048600 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:04:05 crc kubenswrapper[5004]: I1203 15:04:05.124629 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:04:05 crc kubenswrapper[5004]: I1203 15:04:05.819249 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:04:06 crc kubenswrapper[5004]: I1203 15:04:06.451786 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xl5b6" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="registry-server" containerID="cri-o://c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d" gracePeriod=2 Dec 03 15:04:06 
crc kubenswrapper[5004]: I1203 15:04:06.998610 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.112334 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbrkm\" (UniqueName: \"kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm\") pod \"2867c9dc-088a-4761-8ecb-28c37329ebe9\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.112386 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities\") pod \"2867c9dc-088a-4761-8ecb-28c37329ebe9\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.112502 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content\") pod \"2867c9dc-088a-4761-8ecb-28c37329ebe9\" (UID: \"2867c9dc-088a-4761-8ecb-28c37329ebe9\") " Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.113393 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities" (OuterVolumeSpecName: "utilities") pod "2867c9dc-088a-4761-8ecb-28c37329ebe9" (UID: "2867c9dc-088a-4761-8ecb-28c37329ebe9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.120443 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm" (OuterVolumeSpecName: "kube-api-access-fbrkm") pod "2867c9dc-088a-4761-8ecb-28c37329ebe9" (UID: "2867c9dc-088a-4761-8ecb-28c37329ebe9"). InnerVolumeSpecName "kube-api-access-fbrkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.182336 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2867c9dc-088a-4761-8ecb-28c37329ebe9" (UID: "2867c9dc-088a-4761-8ecb-28c37329ebe9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.214788 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.214830 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbrkm\" (UniqueName: \"kubernetes.io/projected/2867c9dc-088a-4761-8ecb-28c37329ebe9-kube-api-access-fbrkm\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.214844 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2867c9dc-088a-4761-8ecb-28c37329ebe9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.468370 5004 generic.go:334] "Generic (PLEG): container finished" podID="2867c9dc-088a-4761-8ecb-28c37329ebe9" 
containerID="c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d" exitCode=0 Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.468446 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xl5b6" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.468446 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerDied","Data":"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d"} Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.468815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl5b6" event={"ID":"2867c9dc-088a-4761-8ecb-28c37329ebe9","Type":"ContainerDied","Data":"e0800935e68d0d90de7d8e60282d942345f051fa9f935bc092262de7509cab25"} Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.468872 5004 scope.go:117] "RemoveContainer" containerID="c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.509697 5004 scope.go:117] "RemoveContainer" containerID="24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.513899 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.524779 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xl5b6"] Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.556987 5004 scope.go:117] "RemoveContainer" containerID="da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.585381 5004 scope.go:117] "RemoveContainer" containerID="c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d" Dec 03 
15:04:07 crc kubenswrapper[5004]: E1203 15:04:07.587467 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d\": container with ID starting with c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d not found: ID does not exist" containerID="c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.587511 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d"} err="failed to get container status \"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d\": rpc error: code = NotFound desc = could not find container \"c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d\": container with ID starting with c53cc027600611998415cfb56243b75fc84775c237f7cdf1f7f41188b60ecb1d not found: ID does not exist" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.587539 5004 scope.go:117] "RemoveContainer" containerID="24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad" Dec 03 15:04:07 crc kubenswrapper[5004]: E1203 15:04:07.588236 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad\": container with ID starting with 24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad not found: ID does not exist" containerID="24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.588281 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad"} err="failed to get container status 
\"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad\": rpc error: code = NotFound desc = could not find container \"24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad\": container with ID starting with 24fc7c47becb10e8389800090560dac5394e706257f9b84ce9456bd43921b8ad not found: ID does not exist" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.588310 5004 scope.go:117] "RemoveContainer" containerID="da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19" Dec 03 15:04:07 crc kubenswrapper[5004]: E1203 15:04:07.588703 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19\": container with ID starting with da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19 not found: ID does not exist" containerID="da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.588757 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19"} err="failed to get container status \"da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19\": rpc error: code = NotFound desc = could not find container \"da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19\": container with ID starting with da7162616d62acf89c509295c09f2168e0b715539417199bc58043cb6d973f19 not found: ID does not exist" Dec 03 15:04:07 crc kubenswrapper[5004]: I1203 15:04:07.622632 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" path="/var/lib/kubelet/pods/2867c9dc-088a-4761-8ecb-28c37329ebe9/volumes" Dec 03 15:05:12 crc kubenswrapper[5004]: I1203 15:05:12.238131 5004 generic.go:334] "Generic (PLEG): container finished" podID="be771b30-f62b-4d18-977a-2c0d6ecca56a" 
containerID="d3f36bce23b86e1409d8eb029f83014c9e12251b563d8811c05c75e9253d2610" exitCode=0 Dec 03 15:05:12 crc kubenswrapper[5004]: I1203 15:05:12.238195 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"be771b30-f62b-4d18-977a-2c0d6ecca56a","Type":"ContainerDied","Data":"d3f36bce23b86e1409d8eb029f83014c9e12251b563d8811c05c75e9253d2610"} Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.631744 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781132 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrbr2\" (UniqueName: \"kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781198 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781299 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781366 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: 
\"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781401 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781466 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781547 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.781592 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret\") pod \"be771b30-f62b-4d18-977a-2c0d6ecca56a\" (UID: \"be771b30-f62b-4d18-977a-2c0d6ecca56a\") " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.782729 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.783010 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data" (OuterVolumeSpecName: "config-data") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.786755 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.787268 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2" (OuterVolumeSpecName: "kube-api-access-rrbr2") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "kube-api-access-rrbr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.787457 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.812179 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.813306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.819467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.851220 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "be771b30-f62b-4d18-977a-2c0d6ecca56a" (UID: "be771b30-f62b-4d18-977a-2c0d6ecca56a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884283 5004 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884314 5004 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884329 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884441 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884479 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884495 5004 reconciler_common.go:293] "Volume detached for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/be771b30-f62b-4d18-977a-2c0d6ecca56a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884506 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884517 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrbr2\" (UniqueName: \"kubernetes.io/projected/be771b30-f62b-4d18-977a-2c0d6ecca56a-kube-api-access-rrbr2\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.884527 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be771b30-f62b-4d18-977a-2c0d6ecca56a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.913711 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 15:05:13 crc kubenswrapper[5004]: I1203 15:05:13.986935 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 15:05:14 crc kubenswrapper[5004]: I1203 15:05:14.262412 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"be771b30-f62b-4d18-977a-2c0d6ecca56a","Type":"ContainerDied","Data":"870764504e96c0641b9001f32c7df3c0210528d04c1bbed0a47ee8a299855cfe"} Dec 03 15:05:14 crc kubenswrapper[5004]: I1203 15:05:14.263145 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870764504e96c0641b9001f32c7df3c0210528d04c1bbed0a47ee8a299855cfe" 
Dec 03 15:05:14 crc kubenswrapper[5004]: I1203 15:05:14.262491 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.308690 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309793 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="extract-content" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309813 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="extract-content" Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309833 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="extract-utilities" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309843 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="extract-utilities" Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309879 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be771b30-f62b-4d18-977a-2c0d6ecca56a" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309888 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="be771b30-f62b-4d18-977a-2c0d6ecca56a" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309912 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="registry-server" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309921 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="registry-server" 
Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309936 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="extract-utilities" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309945 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="extract-utilities" Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.309970 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="registry-server" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.309978 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="registry-server" Dec 03 15:05:18 crc kubenswrapper[5004]: E1203 15:05:18.310000 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="extract-content" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.310008 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="extract-content" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.310237 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6cfb23-d879-440d-8b64-13e9230f31bc" containerName="registry-server" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.310264 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="be771b30-f62b-4d18-977a-2c0d6ecca56a" containerName="tempest-tests-tempest-tests-runner" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.310289 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2867c9dc-088a-4761-8ecb-28c37329ebe9" containerName="registry-server" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.311186 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.314750 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6dldj" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.322207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.499061 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.499354 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzckc\" (UniqueName: \"kubernetes.io/projected/b8fbccb0-338d-4e17-915a-25d07c3491c9-kube-api-access-vzckc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.601583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.601728 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzckc\" (UniqueName: 
\"kubernetes.io/projected/b8fbccb0-338d-4e17-915a-25d07c3491c9-kube-api-access-vzckc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.602774 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.638517 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzckc\" (UniqueName: \"kubernetes.io/projected/b8fbccb0-338d-4e17-915a-25d07c3491c9-kube-api-access-vzckc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.639535 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8fbccb0-338d-4e17-915a-25d07c3491c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:18 crc kubenswrapper[5004]: I1203 15:05:18.652657 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 15:05:19 crc kubenswrapper[5004]: I1203 15:05:19.183466 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 15:05:19 crc kubenswrapper[5004]: I1203 15:05:19.322612 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8fbccb0-338d-4e17-915a-25d07c3491c9","Type":"ContainerStarted","Data":"54cd12a3ef24cdb2b60bcd1248406e97a074b00676651b9d4aaeb0dbbb524473"} Dec 03 15:05:24 crc kubenswrapper[5004]: I1203 15:05:24.382702 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8fbccb0-338d-4e17-915a-25d07c3491c9","Type":"ContainerStarted","Data":"8f3492f8f5c5601d88f3ed5f985f5555de4a7ca0eeb73b15561d30f13d6d27a0"} Dec 03 15:05:24 crc kubenswrapper[5004]: I1203 15:05:24.410518 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.489194142 podStartE2EDuration="6.410496383s" podCreationTimestamp="2025-12-03 15:05:18 +0000 UTC" firstStartedPulling="2025-12-03 15:05:19.18864848 +0000 UTC m=+3531.937618716" lastFinishedPulling="2025-12-03 15:05:23.109950701 +0000 UTC m=+3535.858920957" observedRunningTime="2025-12-03 15:05:24.402212926 +0000 UTC m=+3537.151183162" watchObservedRunningTime="2025-12-03 15:05:24.410496383 +0000 UTC m=+3537.159466619" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.531796 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2gq7/must-gather-ppljs"] Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.534261 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.537519 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p2gq7"/"openshift-service-ca.crt" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.537784 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p2gq7"/"kube-root-ca.crt" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.556514 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2gq7/must-gather-ppljs"] Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.699083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.699160 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hc9l\" (UniqueName: \"kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.801252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.801340 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6hc9l\" (UniqueName: \"kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.801721 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.819051 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hc9l\" (UniqueName: \"kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l\") pod \"must-gather-ppljs\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:49 crc kubenswrapper[5004]: I1203 15:05:49.855520 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:05:50 crc kubenswrapper[5004]: I1203 15:05:50.315543 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2gq7/must-gather-ppljs"] Dec 03 15:05:50 crc kubenswrapper[5004]: W1203 15:05:50.327356 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13dffc5_936b_4314_b0e0_69ef17717995.slice/crio-9af804806d18cd64571fef33b33343710b7b75ff8d3ea711154c19c736654bfb WatchSource:0}: Error finding container 9af804806d18cd64571fef33b33343710b7b75ff8d3ea711154c19c736654bfb: Status 404 returned error can't find the container with id 9af804806d18cd64571fef33b33343710b7b75ff8d3ea711154c19c736654bfb Dec 03 15:05:50 crc kubenswrapper[5004]: I1203 15:05:50.616927 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/must-gather-ppljs" event={"ID":"c13dffc5-936b-4314-b0e0-69ef17717995","Type":"ContainerStarted","Data":"9af804806d18cd64571fef33b33343710b7b75ff8d3ea711154c19c736654bfb"} Dec 03 15:05:52 crc kubenswrapper[5004]: I1203 15:05:52.824470 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:05:52 crc kubenswrapper[5004]: I1203 15:05:52.824840 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:05:55 crc kubenswrapper[5004]: I1203 15:05:55.658445 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-p2gq7/must-gather-ppljs" event={"ID":"c13dffc5-936b-4314-b0e0-69ef17717995","Type":"ContainerStarted","Data":"fc064887c483ce20a96bb6915150d3b9c153b172640276f421f3d5f79826e3fd"} Dec 03 15:05:55 crc kubenswrapper[5004]: I1203 15:05:55.658700 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/must-gather-ppljs" event={"ID":"c13dffc5-936b-4314-b0e0-69ef17717995","Type":"ContainerStarted","Data":"9184952f6de45b2d5eea9661737eddf366e425fd5b67275466cae37aadf2f57b"} Dec 03 15:05:55 crc kubenswrapper[5004]: I1203 15:05:55.685624 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2gq7/must-gather-ppljs" podStartSLOduration=2.205522147 podStartE2EDuration="6.685606386s" podCreationTimestamp="2025-12-03 15:05:49 +0000 UTC" firstStartedPulling="2025-12-03 15:05:50.330004958 +0000 UTC m=+3563.078975194" lastFinishedPulling="2025-12-03 15:05:54.810089197 +0000 UTC m=+3567.559059433" observedRunningTime="2025-12-03 15:05:55.677793922 +0000 UTC m=+3568.426764168" watchObservedRunningTime="2025-12-03 15:05:55.685606386 +0000 UTC m=+3568.434576632" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.531292 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-7zsh6"] Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.536834 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.539026 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p2gq7"/"default-dockercfg-s4ftc" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.702055 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.702399 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtwk\" (UniqueName: \"kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.804721 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.804773 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtwk\" (UniqueName: \"kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.805398 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.827964 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtwk\" (UniqueName: \"kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk\") pod \"crc-debug-7zsh6\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:58 crc kubenswrapper[5004]: I1203 15:05:58.884279 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:05:59 crc kubenswrapper[5004]: I1203 15:05:59.702609 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" event={"ID":"d276f240-4145-484b-839b-03c9f8a6c201","Type":"ContainerStarted","Data":"798cea466c774bb0d6f114c56076f3c193a5410edaa542d0558c244c165c2dc9"} Dec 03 15:06:10 crc kubenswrapper[5004]: I1203 15:06:10.813197 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" event={"ID":"d276f240-4145-484b-839b-03c9f8a6c201","Type":"ContainerStarted","Data":"8a79b81ede1c9e8ff27737f125af993f1101e950ed2d9e144f7ed913ddbdb74a"} Dec 03 15:06:10 crc kubenswrapper[5004]: I1203 15:06:10.830661 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" podStartSLOduration=2.122694042 podStartE2EDuration="12.830640273s" podCreationTimestamp="2025-12-03 15:05:58 +0000 UTC" firstStartedPulling="2025-12-03 15:05:58.92644612 +0000 UTC m=+3571.675416366" lastFinishedPulling="2025-12-03 15:06:09.634392361 +0000 UTC m=+3582.383362597" observedRunningTime="2025-12-03 15:06:10.828814831 +0000 UTC m=+3583.577785067" watchObservedRunningTime="2025-12-03 
15:06:10.830640273 +0000 UTC m=+3583.579610519" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.691453 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.695820 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.703809 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.835685 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff9qd\" (UniqueName: \"kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.835942 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.836079 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.937541 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.937675 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff9qd\" (UniqueName: \"kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.937760 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.938351 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.938353 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:21 crc kubenswrapper[5004]: I1203 15:06:21.966605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff9qd\" (UniqueName: 
\"kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd\") pod \"redhat-marketplace-j68br\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:22 crc kubenswrapper[5004]: I1203 15:06:22.823960 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:06:22 crc kubenswrapper[5004]: I1203 15:06:22.824304 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:06:24 crc kubenswrapper[5004]: I1203 15:06:24.176886 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:24 crc kubenswrapper[5004]: I1203 15:06:24.653945 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:24 crc kubenswrapper[5004]: W1203 15:06:24.670022 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b353d2_4859_446f_9a53_2bcd49bb0f0d.slice/crio-b21a679f251a7279c97bdbbf0a7ef11a1fc81ed971c7d2d8520b9624fdd501cf WatchSource:0}: Error finding container b21a679f251a7279c97bdbbf0a7ef11a1fc81ed971c7d2d8520b9624fdd501cf: Status 404 returned error can't find the container with id b21a679f251a7279c97bdbbf0a7ef11a1fc81ed971c7d2d8520b9624fdd501cf Dec 03 15:06:24 crc kubenswrapper[5004]: I1203 15:06:24.932432 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerStarted","Data":"b21a679f251a7279c97bdbbf0a7ef11a1fc81ed971c7d2d8520b9624fdd501cf"} Dec 03 15:06:25 crc kubenswrapper[5004]: I1203 15:06:25.941622 5004 generic.go:334] "Generic (PLEG): container finished" podID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerID="2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166" exitCode=0 Dec 03 15:06:25 crc kubenswrapper[5004]: I1203 15:06:25.941703 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerDied","Data":"2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166"} Dec 03 15:06:26 crc kubenswrapper[5004]: I1203 15:06:26.953043 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" 
event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerStarted","Data":"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24"} Dec 03 15:06:27 crc kubenswrapper[5004]: E1203 15:06:27.063598 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b353d2_4859_446f_9a53_2bcd49bb0f0d.slice/crio-74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24.scope\": RecentStats: unable to find data in memory cache]" Dec 03 15:06:27 crc kubenswrapper[5004]: I1203 15:06:27.962715 5004 generic.go:334] "Generic (PLEG): container finished" podID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerID="74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24" exitCode=0 Dec 03 15:06:27 crc kubenswrapper[5004]: I1203 15:06:27.962764 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerDied","Data":"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24"} Dec 03 15:06:27 crc kubenswrapper[5004]: I1203 15:06:27.964673 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:06:28 crc kubenswrapper[5004]: I1203 15:06:28.975512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerStarted","Data":"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e"} Dec 03 15:06:28 crc kubenswrapper[5004]: I1203 15:06:28.995456 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j68br" podStartSLOduration=5.427335826 podStartE2EDuration="7.995435946s" podCreationTimestamp="2025-12-03 15:06:21 +0000 UTC" firstStartedPulling="2025-12-03 15:06:25.944392143 +0000 UTC 
m=+3598.693362369" lastFinishedPulling="2025-12-03 15:06:28.512492253 +0000 UTC m=+3601.261462489" observedRunningTime="2025-12-03 15:06:28.994000455 +0000 UTC m=+3601.742970691" watchObservedRunningTime="2025-12-03 15:06:28.995435946 +0000 UTC m=+3601.744406192" Dec 03 15:06:34 crc kubenswrapper[5004]: I1203 15:06:34.177059 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:34 crc kubenswrapper[5004]: I1203 15:06:34.177604 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:34 crc kubenswrapper[5004]: I1203 15:06:34.228909 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:35 crc kubenswrapper[5004]: I1203 15:06:35.097403 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:35 crc kubenswrapper[5004]: I1203 15:06:35.164851 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.050042 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j68br" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="registry-server" containerID="cri-o://6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e" gracePeriod=2 Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.563741 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.743816 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff9qd\" (UniqueName: \"kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd\") pod \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.743981 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities\") pod \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.744066 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content\") pod \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\" (UID: \"57b353d2-4859-446f-9a53-2bcd49bb0f0d\") " Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.747365 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities" (OuterVolumeSpecName: "utilities") pod "57b353d2-4859-446f-9a53-2bcd49bb0f0d" (UID: "57b353d2-4859-446f-9a53-2bcd49bb0f0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.757221 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd" (OuterVolumeSpecName: "kube-api-access-ff9qd") pod "57b353d2-4859-446f-9a53-2bcd49bb0f0d" (UID: "57b353d2-4859-446f-9a53-2bcd49bb0f0d"). InnerVolumeSpecName "kube-api-access-ff9qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.792725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57b353d2-4859-446f-9a53-2bcd49bb0f0d" (UID: "57b353d2-4859-446f-9a53-2bcd49bb0f0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.845970 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff9qd\" (UniqueName: \"kubernetes.io/projected/57b353d2-4859-446f-9a53-2bcd49bb0f0d-kube-api-access-ff9qd\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.846003 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:37 crc kubenswrapper[5004]: I1203 15:06:37.846013 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b353d2-4859-446f-9a53-2bcd49bb0f0d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.062668 5004 generic.go:334] "Generic (PLEG): container finished" podID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerID="6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e" exitCode=0 Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.062710 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerDied","Data":"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e"} Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.062736 5004 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-j68br" event={"ID":"57b353d2-4859-446f-9a53-2bcd49bb0f0d","Type":"ContainerDied","Data":"b21a679f251a7279c97bdbbf0a7ef11a1fc81ed971c7d2d8520b9624fdd501cf"} Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.062767 5004 scope.go:117] "RemoveContainer" containerID="6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.062785 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j68br" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.094590 5004 scope.go:117] "RemoveContainer" containerID="74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.120910 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.128318 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j68br"] Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.128356 5004 scope.go:117] "RemoveContainer" containerID="2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.163036 5004 scope.go:117] "RemoveContainer" containerID="6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e" Dec 03 15:06:38 crc kubenswrapper[5004]: E1203 15:06:38.163418 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e\": container with ID starting with 6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e not found: ID does not exist" containerID="6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.163456 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e"} err="failed to get container status \"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e\": rpc error: code = NotFound desc = could not find container \"6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e\": container with ID starting with 6ba4778b534c3a75baf4fc0bbb0c374e0c3c1dc6724fabec331e6589e91e7c7e not found: ID does not exist" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.163485 5004 scope.go:117] "RemoveContainer" containerID="74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24" Dec 03 15:06:38 crc kubenswrapper[5004]: E1203 15:06:38.163750 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24\": container with ID starting with 74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24 not found: ID does not exist" containerID="74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.163776 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24"} err="failed to get container status \"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24\": rpc error: code = NotFound desc = could not find container \"74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24\": container with ID starting with 74e0aaed89525f50167ef822011fa396f5a10153383bb1d1a4ea72c09dfa7d24 not found: ID does not exist" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.163790 5004 scope.go:117] "RemoveContainer" containerID="2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166" Dec 03 15:06:38 crc kubenswrapper[5004]: E1203 
15:06:38.164259 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166\": container with ID starting with 2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166 not found: ID does not exist" containerID="2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166" Dec 03 15:06:38 crc kubenswrapper[5004]: I1203 15:06:38.164288 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166"} err="failed to get container status \"2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166\": rpc error: code = NotFound desc = could not find container \"2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166\": container with ID starting with 2efe15dc693ce723d02cc63441f2f0648bc3f61adf7a822a4af377672970c166 not found: ID does not exist" Dec 03 15:06:39 crc kubenswrapper[5004]: I1203 15:06:39.625075 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" path="/var/lib/kubelet/pods/57b353d2-4859-446f-9a53-2bcd49bb0f0d/volumes" Dec 03 15:06:48 crc kubenswrapper[5004]: I1203 15:06:48.175698 5004 generic.go:334] "Generic (PLEG): container finished" podID="d276f240-4145-484b-839b-03c9f8a6c201" containerID="8a79b81ede1c9e8ff27737f125af993f1101e950ed2d9e144f7ed913ddbdb74a" exitCode=0 Dec 03 15:06:48 crc kubenswrapper[5004]: I1203 15:06:48.175792 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" event={"ID":"d276f240-4145-484b-839b-03c9f8a6c201","Type":"ContainerDied","Data":"8a79b81ede1c9e8ff27737f125af993f1101e950ed2d9e144f7ed913ddbdb74a"} Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.301595 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.358959 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-7zsh6"] Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.370247 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-7zsh6"] Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.484872 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djtwk\" (UniqueName: \"kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk\") pod \"d276f240-4145-484b-839b-03c9f8a6c201\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.485412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host\") pod \"d276f240-4145-484b-839b-03c9f8a6c201\" (UID: \"d276f240-4145-484b-839b-03c9f8a6c201\") " Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.485534 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host" (OuterVolumeSpecName: "host") pod "d276f240-4145-484b-839b-03c9f8a6c201" (UID: "d276f240-4145-484b-839b-03c9f8a6c201"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.486081 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d276f240-4145-484b-839b-03c9f8a6c201-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.492099 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk" (OuterVolumeSpecName: "kube-api-access-djtwk") pod "d276f240-4145-484b-839b-03c9f8a6c201" (UID: "d276f240-4145-484b-839b-03c9f8a6c201"). InnerVolumeSpecName "kube-api-access-djtwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.587548 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djtwk\" (UniqueName: \"kubernetes.io/projected/d276f240-4145-484b-839b-03c9f8a6c201-kube-api-access-djtwk\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:49 crc kubenswrapper[5004]: I1203 15:06:49.625413 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d276f240-4145-484b-839b-03c9f8a6c201" path="/var/lib/kubelet/pods/d276f240-4145-484b-839b-03c9f8a6c201/volumes" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.198479 5004 scope.go:117] "RemoveContainer" containerID="8a79b81ede1c9e8ff27737f125af993f1101e950ed2d9e144f7ed913ddbdb74a" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.198543 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-7zsh6" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.573211 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-sw95d"] Dec 03 15:06:50 crc kubenswrapper[5004]: E1203 15:06:50.573883 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="extract-content" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.573896 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="extract-content" Dec 03 15:06:50 crc kubenswrapper[5004]: E1203 15:06:50.573911 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="extract-utilities" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.573917 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="extract-utilities" Dec 03 15:06:50 crc kubenswrapper[5004]: E1203 15:06:50.573951 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d276f240-4145-484b-839b-03c9f8a6c201" containerName="container-00" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.573957 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d276f240-4145-484b-839b-03c9f8a6c201" containerName="container-00" Dec 03 15:06:50 crc kubenswrapper[5004]: E1203 15:06:50.573964 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="registry-server" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.573969 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="registry-server" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.574134 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d276f240-4145-484b-839b-03c9f8a6c201" 
containerName="container-00" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.574149 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b353d2-4859-446f-9a53-2bcd49bb0f0d" containerName="registry-server" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.574741 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.583420 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p2gq7"/"default-dockercfg-s4ftc" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.603725 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xz8p\" (UniqueName: \"kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.603840 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.705183 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xz8p\" (UniqueName: \"kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.705280 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.705524 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.723338 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xz8p\" (UniqueName: \"kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p\") pod \"crc-debug-sw95d\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: I1203 15:06:50.894020 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:50 crc kubenswrapper[5004]: W1203 15:06:50.928083 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d81d3d7_4491_4196_ad13_152748b3895e.slice/crio-83088c1eb73ad4f40626914f451af32cd5f23c0ae2fa506a9fa5caaef86be6d8 WatchSource:0}: Error finding container 83088c1eb73ad4f40626914f451af32cd5f23c0ae2fa506a9fa5caaef86be6d8: Status 404 returned error can't find the container with id 83088c1eb73ad4f40626914f451af32cd5f23c0ae2fa506a9fa5caaef86be6d8 Dec 03 15:06:51 crc kubenswrapper[5004]: I1203 15:06:51.208893 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" event={"ID":"6d81d3d7-4491-4196-ad13-152748b3895e","Type":"ContainerStarted","Data":"abc3724b0559ef8b38c1dbe228208ba2d3a26a47e12160a657c4e6deeaaf2393"} Dec 03 15:06:51 crc kubenswrapper[5004]: I1203 15:06:51.208950 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" event={"ID":"6d81d3d7-4491-4196-ad13-152748b3895e","Type":"ContainerStarted","Data":"83088c1eb73ad4f40626914f451af32cd5f23c0ae2fa506a9fa5caaef86be6d8"} Dec 03 15:06:51 crc kubenswrapper[5004]: I1203 15:06:51.223665 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" podStartSLOduration=1.223650606 podStartE2EDuration="1.223650606s" podCreationTimestamp="2025-12-03 15:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:06:51.220824035 +0000 UTC m=+3623.969794281" watchObservedRunningTime="2025-12-03 15:06:51.223650606 +0000 UTC m=+3623.972620842" Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.220055 5004 generic.go:334] "Generic (PLEG): container finished" podID="6d81d3d7-4491-4196-ad13-152748b3895e" 
containerID="abc3724b0559ef8b38c1dbe228208ba2d3a26a47e12160a657c4e6deeaaf2393" exitCode=0 Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.220137 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" event={"ID":"6d81d3d7-4491-4196-ad13-152748b3895e","Type":"ContainerDied","Data":"abc3724b0559ef8b38c1dbe228208ba2d3a26a47e12160a657c4e6deeaaf2393"} Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.824337 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.824436 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.824509 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.825477 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:06:52 crc kubenswrapper[5004]: I1203 15:06:52.825578 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06" gracePeriod=600 Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.233036 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06" exitCode=0 Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.233251 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06"} Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.233284 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c"} Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.233305 5004 scope.go:117] "RemoveContainer" containerID="ec70ee2aacf6e25e9b486fa752aadd9e2478fc6b1ca00128dffd011d75623419" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.363199 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.414680 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-sw95d"] Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.422907 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-sw95d"] Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.454961 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host\") pod \"6d81d3d7-4491-4196-ad13-152748b3895e\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.455058 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host" (OuterVolumeSpecName: "host") pod "6d81d3d7-4491-4196-ad13-152748b3895e" (UID: "6d81d3d7-4491-4196-ad13-152748b3895e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.455180 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xz8p\" (UniqueName: \"kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p\") pod \"6d81d3d7-4491-4196-ad13-152748b3895e\" (UID: \"6d81d3d7-4491-4196-ad13-152748b3895e\") " Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.455680 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d81d3d7-4491-4196-ad13-152748b3895e-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.461391 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p" (OuterVolumeSpecName: "kube-api-access-6xz8p") pod "6d81d3d7-4491-4196-ad13-152748b3895e" (UID: "6d81d3d7-4491-4196-ad13-152748b3895e"). InnerVolumeSpecName "kube-api-access-6xz8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.557345 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xz8p\" (UniqueName: \"kubernetes.io/projected/6d81d3d7-4491-4196-ad13-152748b3895e-kube-api-access-6xz8p\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:53 crc kubenswrapper[5004]: I1203 15:06:53.623303 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d81d3d7-4491-4196-ad13-152748b3895e" path="/var/lib/kubelet/pods/6d81d3d7-4491-4196-ad13-152748b3895e/volumes" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.260309 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-sw95d" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.260361 5004 scope.go:117] "RemoveContainer" containerID="abc3724b0559ef8b38c1dbe228208ba2d3a26a47e12160a657c4e6deeaaf2393" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.604340 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-kjh7s"] Dec 03 15:06:54 crc kubenswrapper[5004]: E1203 15:06:54.605484 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d81d3d7-4491-4196-ad13-152748b3895e" containerName="container-00" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.605528 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d81d3d7-4491-4196-ad13-152748b3895e" containerName="container-00" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.605841 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d81d3d7-4491-4196-ad13-152748b3895e" containerName="container-00" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.606911 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.613016 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p2gq7"/"default-dockercfg-s4ftc" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.783701 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.784000 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpgk\" (UniqueName: \"kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.885410 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpgk\" (UniqueName: \"kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.885526 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.885666 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.910493 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpgk\" (UniqueName: \"kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk\") pod \"crc-debug-kjh7s\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: I1203 15:06:54.934447 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:54 crc kubenswrapper[5004]: W1203 15:06:54.987083 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76a8ac34_f8d9_4547_a922_1796568b0dc8.slice/crio-a8176c55976c5a38f19d9947611bd75bb875b1da9b3705fb5e77c02f12aae70e WatchSource:0}: Error finding container a8176c55976c5a38f19d9947611bd75bb875b1da9b3705fb5e77c02f12aae70e: Status 404 returned error can't find the container with id a8176c55976c5a38f19d9947611bd75bb875b1da9b3705fb5e77c02f12aae70e Dec 03 15:06:55 crc kubenswrapper[5004]: I1203 15:06:55.276573 5004 generic.go:334] "Generic (PLEG): container finished" podID="76a8ac34-f8d9-4547-a922-1796568b0dc8" containerID="376d5b52831335b4563b7ef23c862312f40fd9c33f421697dc4e689f90684dcf" exitCode=0 Dec 03 15:06:55 crc kubenswrapper[5004]: I1203 15:06:55.276622 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" event={"ID":"76a8ac34-f8d9-4547-a922-1796568b0dc8","Type":"ContainerDied","Data":"376d5b52831335b4563b7ef23c862312f40fd9c33f421697dc4e689f90684dcf"} Dec 03 15:06:55 crc kubenswrapper[5004]: I1203 15:06:55.276650 5004 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" event={"ID":"76a8ac34-f8d9-4547-a922-1796568b0dc8","Type":"ContainerStarted","Data":"a8176c55976c5a38f19d9947611bd75bb875b1da9b3705fb5e77c02f12aae70e"} Dec 03 15:06:55 crc kubenswrapper[5004]: I1203 15:06:55.324468 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-kjh7s"] Dec 03 15:06:55 crc kubenswrapper[5004]: I1203 15:06:55.334033 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p2gq7/crc-debug-kjh7s"] Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.412260 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.514090 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxpgk\" (UniqueName: \"kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk\") pod \"76a8ac34-f8d9-4547-a922-1796568b0dc8\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.514649 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host\") pod \"76a8ac34-f8d9-4547-a922-1796568b0dc8\" (UID: \"76a8ac34-f8d9-4547-a922-1796568b0dc8\") " Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.515193 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host" (OuterVolumeSpecName: "host") pod "76a8ac34-f8d9-4547-a922-1796568b0dc8" (UID: "76a8ac34-f8d9-4547-a922-1796568b0dc8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.538250 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk" (OuterVolumeSpecName: "kube-api-access-vxpgk") pod "76a8ac34-f8d9-4547-a922-1796568b0dc8" (UID: "76a8ac34-f8d9-4547-a922-1796568b0dc8"). InnerVolumeSpecName "kube-api-access-vxpgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.616217 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxpgk\" (UniqueName: \"kubernetes.io/projected/76a8ac34-f8d9-4547-a922-1796568b0dc8-kube-api-access-vxpgk\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:56 crc kubenswrapper[5004]: I1203 15:06:56.616255 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76a8ac34-f8d9-4547-a922-1796568b0dc8-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:06:57 crc kubenswrapper[5004]: I1203 15:06:57.293457 5004 scope.go:117] "RemoveContainer" containerID="376d5b52831335b4563b7ef23c862312f40fd9c33f421697dc4e689f90684dcf" Dec 03 15:06:57 crc kubenswrapper[5004]: I1203 15:06:57.293469 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/crc-debug-kjh7s" Dec 03 15:06:57 crc kubenswrapper[5004]: I1203 15:06:57.634812 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a8ac34-f8d9-4547-a922-1796568b0dc8" path="/var/lib/kubelet/pods/76a8ac34-f8d9-4547-a922-1796568b0dc8/volumes" Dec 03 15:07:09 crc kubenswrapper[5004]: I1203 15:07:09.981032 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f76744786-jfgf7_939c0a06-65e2-45ea-b58d-7d4cc431207b/barbican-api/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.125565 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f76744786-jfgf7_939c0a06-65e2-45ea-b58d-7d4cc431207b/barbican-api-log/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.164824 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c799f8fcd-gg559_c4501cb8-8287-4e9d-83b2-858fcb7c431c/barbican-keystone-listener/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.243930 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c799f8fcd-gg559_c4501cb8-8287-4e9d-83b2-858fcb7c431c/barbican-keystone-listener-log/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.376947 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56979bf587-swwll_033d65ef-e917-445c-9c56-cffb8b328dbf/barbican-worker/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.422811 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56979bf587-swwll_033d65ef-e917-445c-9c56-cffb8b328dbf/barbican-worker-log/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.577449 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4_7a8c5468-695c-4238-9cae-3b010f6987ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.675277 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/ceilometer-central-agent/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.707147 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/ceilometer-notification-agent/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.796882 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/proxy-httpd/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.908597 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/sg-core/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.928405 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e715befa-4ae4-4466-beb4-ee8939e3bb86/cinder-api/0.log" Dec 03 15:07:10 crc kubenswrapper[5004]: I1203 15:07:10.999005 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e715befa-4ae4-4466-beb4-ee8939e3bb86/cinder-api-log/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.146916 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7502099c-9fa6-4071-8ce6-4471b9f44f78/probe/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.166002 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7502099c-9fa6-4071-8ce6-4471b9f44f78/cinder-scheduler/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.330295 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv_632aa0c1-b525-45af-8254-2f0f0dc57c43/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.423090 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sqfng_652a6191-a7f2-47a8-9f26-48137e58ce1b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.517038 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/init/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.688000 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/init/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.744936 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt_faf69ec7-959a-404b-9bae-24bc3c528c28/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.769459 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/dnsmasq-dns/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.927176 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f95292-1d71-478d-ab12-138e2b34bd3f/glance-log/0.log" Dec 03 15:07:11 crc kubenswrapper[5004]: I1203 15:07:11.954803 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f95292-1d71-478d-ab12-138e2b34bd3f/glance-httpd/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.071362 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_3959efd9-7c4e-43b5-b73a-9b05ec3fb59c/glance-httpd/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.146952 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3959efd9-7c4e-43b5-b73a-9b05ec3fb59c/glance-log/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.289085 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79df97d86b-4dr9p_e0f1c734-5c6e-4f15-8f11-1e3c1da2d880/horizon/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.447393 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-wx45m_4f32dcbb-677a-48e6-9c25-eaec1655a155/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.574897 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79df97d86b-4dr9p_e0f1c734-5c6e-4f15-8f11-1e3c1da2d880/horizon-log/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.652175 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9ggz8_5de0359a-b8f8-4989-8739-ee565cd596fe/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.827705 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412901-5c859_ad605cec-0786-4e4e-a1f2-9626a34e39c8/keystone-cron/0.log" Dec 03 15:07:12 crc kubenswrapper[5004]: I1203 15:07:12.901247 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-65f67fcd5d-5p75z_7efdab5a-a074-4ce4-bcc0-b2b8481b886c/keystone-api/0.log" Dec 03 15:07:13 crc kubenswrapper[5004]: I1203 15:07:13.052233 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_70b42ea8-681a-44cb-a494-2093b925d015/kube-state-metrics/0.log" Dec 03 15:07:13 crc kubenswrapper[5004]: I1203 15:07:13.086709 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-22sj5_3e3f3f7f-8810-4c7f-b3b0-975700874959/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:13 crc kubenswrapper[5004]: I1203 15:07:13.459680 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-66456bfc4f-v6lrf_5c21b585-fe01-4f87-9a60-1df17f266659/neutron-api/0.log" Dec 03 15:07:13 crc kubenswrapper[5004]: I1203 15:07:13.599397 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-66456bfc4f-v6lrf_5c21b585-fe01-4f87-9a60-1df17f266659/neutron-httpd/0.log" Dec 03 15:07:13 crc kubenswrapper[5004]: I1203 15:07:13.712278 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c_752f8ea2-1e21-4ff4-aac9-4f1a5f662561/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.112059 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_489bba73-88c3-42e0-ad06-ee95a6073263/nova-api-log/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.163804 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d73689af-1cf5-4846-8e52-c34bce039ca7/nova-cell0-conductor-conductor/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.374950 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_489bba73-88c3-42e0-ad06-ee95a6073263/nova-api-api/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.450817 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_234c4127-5836-4628-a426-2644c4df71a1/nova-cell1-conductor-conductor/0.log" 
Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.490125 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_182237d5-f265-4577-8b9a-51f4e2a64a6a/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.621582 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ffvrz_32a75e28-35af-4a42-ae5c-ac1a24ba78ee/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:14 crc kubenswrapper[5004]: I1203 15:07:14.786059 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a010cac3-4f37-4ffd-8627-5329e566d91a/nova-metadata-log/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.206173 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/mysql-bootstrap/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.232444 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c4e5958b-4572-4965-8948-89fc51a2c486/nova-scheduler-scheduler/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.413359 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/mysql-bootstrap/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.472486 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/galera/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.602814 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/mysql-bootstrap/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.849062 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/mysql-bootstrap/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.856761 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/galera/0.log" Dec 03 15:07:15 crc kubenswrapper[5004]: I1203 15:07:15.955617 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a010cac3-4f37-4ffd-8627-5329e566d91a/nova-metadata-metadata/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.059455 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_32ad9006-53eb-4a4f-8ec0-8c287231374e/openstackclient/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.074766 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-x4sdg_01b52e65-ccad-48ac-91d0-b5b9fb3905cd/openstack-network-exporter/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.292663 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server-init/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.421844 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server-init/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.453382 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.491330 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovs-vswitchd/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.685243 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-zdz2r_9cf66a90-3f7d-4170-8dab-9ff58ba576a3/ovn-controller/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.727487 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zkt82_122d652b-2c6a-4aa2-9303-e844922d4620/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.915637 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9af36c08-ab5c-4a97-88d3-a7ef2f032faf/openstack-network-exporter/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.937334 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9af36c08-ab5c-4a97-88d3-a7ef2f032faf/ovn-northd/0.log" Dec 03 15:07:16 crc kubenswrapper[5004]: I1203 15:07:16.986396 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_116764ff-36a4-444f-8051-e93b94a548fd/openstack-network-exporter/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.145353 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_116764ff-36a4-444f-8051-e93b94a548fd/ovsdbserver-nb/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.170991 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4d5df9d0-5ee6-4981-86b4-e90415206ceb/openstack-network-exporter/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.246212 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4d5df9d0-5ee6-4981-86b4-e90415206ceb/ovsdbserver-sb/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.439200 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ccd4cc976-4jrqc_a6f2bf21-eade-495e-99bb-4d12b3c46c3b/placement-api/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.529169 5004 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-ccd4cc976-4jrqc_a6f2bf21-eade-495e-99bb-4d12b3c46c3b/placement-log/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.627232 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/setup-container/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.787031 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/setup-container/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.795991 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/rabbitmq/0.log" Dec 03 15:07:17 crc kubenswrapper[5004]: I1203 15:07:17.857277 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/setup-container/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.042416 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/setup-container/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.104106 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/rabbitmq/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.129385 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68_0c169631-8cd9-45a0-b295-026cd99d6e41/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.336462 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4skhq_5c867273-ae64-48f8-85f1-4eb5624b9dea/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:18 crc 
kubenswrapper[5004]: I1203 15:07:18.475285 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh_fada131d-446d-4819-b137-48910402240f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.580671 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6j48p_35e88acc-36ab-41a3-ab34-a04a3a4234de/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.771491 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4qr7k_48017895-b32f-4afa-a7bf-e7e41c29d256/ssh-known-hosts-edpm-deployment/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.847581 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b9d87dc5f-trzlj_37c9311f-7b12-474a-ba76-c7c534f55e55/proxy-server/0.log" Dec 03 15:07:18 crc kubenswrapper[5004]: I1203 15:07:18.975768 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b9d87dc5f-trzlj_37c9311f-7b12-474a-ba76-c7c534f55e55/proxy-httpd/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.046487 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7xgl7_f9137320-4b52-422f-a96b-34c555c55aa6/swift-ring-rebalance/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.300415 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-reaper/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.310270 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-replicator/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.340329 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-auditor/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.456662 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-server/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.559915 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-replicator/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.565096 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-auditor/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.591715 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-server/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.667438 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-updater/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.797802 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-expirer/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.811820 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-auditor/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.825035 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-replicator/0.log" Dec 03 15:07:19 crc kubenswrapper[5004]: I1203 15:07:19.852819 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-server/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.003303 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-updater/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.076380 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/rsync/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.080893 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/swift-recon-cron/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.240097 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xx47r_cf32991a-bf4f-4ce6-9d01-3b75e2108b9f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.274141 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_be771b30-f62b-4d18-977a-2c0d6ecca56a/tempest-tests-tempest-tests-runner/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.492622 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8fbccb0-338d-4e17-915a-25d07c3491c9/test-operator-logs-container/0.log" Dec 03 15:07:20 crc kubenswrapper[5004]: I1203 15:07:20.555072 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kftm2_8f5d5c71-22c1-4bd4-a95d-8865928a48c3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:07:28 crc kubenswrapper[5004]: I1203 15:07:28.291303 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_3bfe51e0-df6b-446f-9647-d9165f3cdead/memcached/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.222824 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.364589 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.407315 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.442182 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.625364 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.628443 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.638902 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/extract/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.816243 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-l4b9j_34247b31-24ab-4386-8bf1-f0bfa7df6f00/kube-rbac-proxy/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.872746 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-l4b9j_34247b31-24ab-4386-8bf1-f0bfa7df6f00/manager/0.log" Dec 03 15:07:45 crc kubenswrapper[5004]: I1203 15:07:45.904087 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qffz8_04d75592-adf5-42b6-a02e-0074674b393d/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.047760 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qffz8_04d75592-adf5-42b6-a02e-0074674b393d/manager/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.066551 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-97hkh_e44a1a8b-fd83-478f-9095-73e2f82ed81c/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.143975 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-97hkh_e44a1a8b-fd83-478f-9095-73e2f82ed81c/manager/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.257146 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4lhqd_271911f5-3a7c-448b-976d-268c5b19edc1/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.309594 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4lhqd_271911f5-3a7c-448b-976d-268c5b19edc1/manager/0.log" Dec 03 15:07:46 crc 
kubenswrapper[5004]: I1203 15:07:46.425020 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-thtjj_9be3a985-7677-4334-b270-386feb954a5c/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.460147 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-thtjj_9be3a985-7677-4334-b270-386feb954a5c/manager/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.551504 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lprd2_f10a5021-1caf-47ba-8dce-51021a641f4c/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.663189 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lprd2_f10a5021-1caf-47ba-8dce-51021a641f4c/manager/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.738716 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pv9cw_3741a6af-989d-47ac-a6ee-a6443a4f2883/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.862449 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-6fqgd_419e5e47-1866-473a-a668-2fee54cb76ce/kube-rbac-proxy/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.908508 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pv9cw_3741a6af-989d-47ac-a6ee-a6443a4f2883/manager/0.log" Dec 03 15:07:46 crc kubenswrapper[5004]: I1203 15:07:46.980231 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-6fqgd_419e5e47-1866-473a-a668-2fee54cb76ce/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.105728 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-nwtth_3c18cd5e-8d20-4a2b-a62c-d141de1fc38a/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.186812 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-nwtth_3c18cd5e-8d20-4a2b-a62c-d141de1fc38a/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.283837 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4ck5g_b70998ef-a4ea-49a9-922d-d7ad70346932/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.290256 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4ck5g_b70998ef-a4ea-49a9-922d-d7ad70346932/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.410933 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-kffh7_a38ab130-8698-49c3-bf30-355f88bcdc45/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.492348 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-kffh7_a38ab130-8698-49c3-bf30-355f88bcdc45/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.551357 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4b4kl_f35c5faa-53cc-4829-91a0-1c422eae75f6/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 
15:07:47.651286 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4b4kl_f35c5faa-53cc-4829-91a0-1c422eae75f6/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.682710 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-st4w4_dd7dec16-458d-46f6-9ee6-b0db6551792a/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.847666 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gndmp_a1d5cb2a-85a6-4ff0-a9cf-519397479d2c/kube-rbac-proxy/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.875922 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-st4w4_dd7dec16-458d-46f6-9ee6-b0db6551792a/manager/0.log" Dec 03 15:07:47 crc kubenswrapper[5004]: I1203 15:07:47.924458 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gndmp_a1d5cb2a-85a6-4ff0-a9cf-519397479d2c/manager/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.052610 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd46zspw_7dbdc2c5-5e0c-4315-b836-1acacf93df2d/kube-rbac-proxy/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.070405 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd46zspw_7dbdc2c5-5e0c-4315-b836-1acacf93df2d/manager/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.405984 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d6f666fbc-smswg_1d0cab62-1f81-48c0-a3b3-3a774fcd7b18/operator/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.530575 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-x4d5l_dc67c749-644a-416d-8f75-ebd340795204/registry-server/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.602677 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-td552_2aab9a50-58d3-4eba-8589-c009d3b2b604/kube-rbac-proxy/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.849911 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-5z95c_90f6b1a6-2cd1-4649-b794-e00f64cd80cb/kube-rbac-proxy/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.929680 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-5z95c_90f6b1a6-2cd1-4649-b794-e00f64cd80cb/manager/0.log" Dec 03 15:07:48 crc kubenswrapper[5004]: I1203 15:07:48.960893 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-td552_2aab9a50-58d3-4eba-8589-c009d3b2b604/manager/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.086847 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gzd52_65687c7c-1b6d-485f-b99c-41706846c7a7/operator/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.177030 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8xwrm_9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b/kube-rbac-proxy/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.318141 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8xwrm_9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b/manager/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.408466 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8bnn2_206c7f05-3575-400e-a37b-ba608f159fc5/kube-rbac-proxy/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.482573 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8bnn2_206c7f05-3575-400e-a37b-ba608f159fc5/manager/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.534741 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-92pmx_bf9d689f-bfab-4b05-9b08-d855836a7846/kube-rbac-proxy/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.615838 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-92pmx_bf9d689f-bfab-4b05-9b08-d855836a7846/manager/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.620005 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-79c58f7d4-4qmpw_a3fd1093-3e64-4558-9314-355dbf1c8a8c/manager/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.726170 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rlxc5_298e9f66-a005-42bd-b2f6-4653a88e0177/kube-rbac-proxy/0.log" Dec 03 15:07:49 crc kubenswrapper[5004]: I1203 15:07:49.774087 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rlxc5_298e9f66-a005-42bd-b2f6-4653a88e0177/manager/0.log" Dec 03 15:08:08 crc kubenswrapper[5004]: 
I1203 15:08:08.332407 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-swmpz_f986649e-61c8-4c67-beb3-edc5dc4e4fd9/control-plane-machine-set-operator/0.log" Dec 03 15:08:08 crc kubenswrapper[5004]: I1203 15:08:08.535060 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mdbfw_a887d450-ffa8-4b30-98db-2e223c46b134/machine-api-operator/0.log" Dec 03 15:08:08 crc kubenswrapper[5004]: I1203 15:08:08.585143 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mdbfw_a887d450-ffa8-4b30-98db-2e223c46b134/kube-rbac-proxy/0.log" Dec 03 15:08:20 crc kubenswrapper[5004]: I1203 15:08:20.586360 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4rv6q_0399bdc2-ceca-49e1-a00b-a8685a860ebe/cert-manager-controller/0.log" Dec 03 15:08:20 crc kubenswrapper[5004]: I1203 15:08:20.687507 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hjmk5_6b108801-1198-420f-ab57-dea765daf047/cert-manager-cainjector/0.log" Dec 03 15:08:20 crc kubenswrapper[5004]: I1203 15:08:20.782659 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-swvh8_a64714f4-8d4b-4101-bf1c-d953cddb3f08/cert-manager-webhook/0.log" Dec 03 15:08:32 crc kubenswrapper[5004]: I1203 15:08:32.854297 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pgnkx_3c3279d4-50fa-454f-993b-ce1d1aa33140/nmstate-console-plugin/0.log" Dec 03 15:08:33 crc kubenswrapper[5004]: I1203 15:08:33.024220 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mvxhn_4efdf8bb-b98f-4afa-a605-0bb57c93b999/kube-rbac-proxy/0.log" Dec 03 15:08:33 crc kubenswrapper[5004]: 
I1203 15:08:33.033435 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hdbk9_ba4eef2f-7208-44fd-b116-6f394cf2c7e2/nmstate-handler/0.log" Dec 03 15:08:33 crc kubenswrapper[5004]: I1203 15:08:33.080324 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mvxhn_4efdf8bb-b98f-4afa-a605-0bb57c93b999/nmstate-metrics/0.log" Dec 03 15:08:33 crc kubenswrapper[5004]: I1203 15:08:33.211625 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-pf52k_5a1faf91-6fda-4e62-801d-bb1624d95274/nmstate-operator/0.log" Dec 03 15:08:33 crc kubenswrapper[5004]: I1203 15:08:33.287548 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-xbgh5_c005e57a-6449-4c48-a81c-deda46fc3d02/nmstate-webhook/0.log" Dec 03 15:08:46 crc kubenswrapper[5004]: I1203 15:08:46.990638 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vfwkq_0af74282-be81-45a2-966a-4dcb279d7c6a/kube-rbac-proxy/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.109324 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vfwkq_0af74282-be81-45a2-966a-4dcb279d7c6a/controller/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.225401 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.352682 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.384938 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 
15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.404469 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.438566 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.649037 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.656704 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.667280 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:08:47 crc kubenswrapper[5004]: I1203 15:08:47.668720 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.012689 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.029010 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/controller/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.035526 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.057532 5004 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.207332 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/frr-metrics/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.209287 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/kube-rbac-proxy/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.271163 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/kube-rbac-proxy-frr/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.477904 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-jksx8_b1856a9d-f833-48a2-941b-8c9fd3f06416/frr-k8s-webhook-server/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.487213 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/reloader/0.log" Dec 03 15:08:48 crc kubenswrapper[5004]: I1203 15:08:48.761128 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b7864bd46-w9w6m_6aa91ebc-2da0-4d5e-9847-d5f2758e72e5/manager/0.log" Dec 03 15:08:49 crc kubenswrapper[5004]: I1203 15:08:49.452011 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/frr/0.log" Dec 03 15:08:49 crc kubenswrapper[5004]: I1203 15:08:49.484131 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8c7f47999-twv92_8d864da6-31ee-490f-b4e8-568f95a96ff0/webhook-server/0.log" Dec 03 15:08:49 crc kubenswrapper[5004]: I1203 15:08:49.509616 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6kv8g_27227413-e203-4218-942d-35c1493b7015/kube-rbac-proxy/0.log" Dec 03 15:08:49 crc kubenswrapper[5004]: I1203 15:08:49.919146 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6kv8g_27227413-e203-4218-942d-35c1493b7015/speaker/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.305323 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.476390 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.489867 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.493101 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.687277 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.751066 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.762688 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/extract/0.log" Dec 03 15:09:03 crc kubenswrapper[5004]: I1203 15:09:03.868291 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.078054 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.082824 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.088155 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.249390 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.255898 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.281422 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/extract/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.419213 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.603565 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.613203 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.646701 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.798297 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.812025 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:09:04 crc kubenswrapper[5004]: I1203 15:09:04.995550 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.275774 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.276768 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.298445 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.364433 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/registry-server/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.503711 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.504515 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.738572 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x2qkd_cca7b643-a679-4b89-b42d-a18c552a737b/marketplace-operator/0.log" Dec 03 15:09:05 crc kubenswrapper[5004]: I1203 15:09:05.962765 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.091741 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/registry-server/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.125036 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.163168 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.194765 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.312244 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.333190 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.515918 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/registry-server/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.574027 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.776169 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.776237 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.781940 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:09:06 crc kubenswrapper[5004]: I1203 15:09:06.975172 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:09:07 crc kubenswrapper[5004]: I1203 15:09:07.036841 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" Dec 03 15:09:07 crc kubenswrapper[5004]: I1203 15:09:07.405846 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/registry-server/0.log" Dec 03 15:09:22 crc kubenswrapper[5004]: I1203 15:09:22.823786 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:09:22 crc kubenswrapper[5004]: I1203 15:09:22.824354 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 15:09:41 crc kubenswrapper[5004]: E1203 15:09:41.608966 5004 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.38:38534->38.102.83.38:41371: read tcp 38.102.83.38:38534->38.102.83.38:41371: read: connection reset by peer Dec 03 15:09:41 crc kubenswrapper[5004]: E1203 15:09:41.609564 5004 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:38534->38.102.83.38:41371: write tcp 38.102.83.38:38534->38.102.83.38:41371: write: broken pipe Dec 03 15:09:52 crc kubenswrapper[5004]: I1203 15:09:52.824350 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:09:52 crc kubenswrapper[5004]: I1203 15:09:52.824926 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:10:22 crc kubenswrapper[5004]: I1203 15:10:22.825082 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:10:22 crc kubenswrapper[5004]: I1203 15:10:22.825748 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:10:22 crc kubenswrapper[5004]: I1203 15:10:22.825819 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 15:10:22 crc kubenswrapper[5004]: I1203 15:10:22.827159 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:10:22 crc kubenswrapper[5004]: I1203 15:10:22.827305 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" gracePeriod=600 Dec 03 15:10:23 crc kubenswrapper[5004]: I1203 15:10:23.195768 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" exitCode=0 Dec 03 15:10:23 crc kubenswrapper[5004]: I1203 15:10:23.195814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c"} Dec 03 15:10:23 crc kubenswrapper[5004]: I1203 15:10:23.195934 5004 scope.go:117] "RemoveContainer" containerID="1b896cd2ff793c19202be6bfd582f4ab0e4bc0ed7c72fe0a5e6b9c6228ba6e06" Dec 03 15:10:23 crc kubenswrapper[5004]: E1203 15:10:23.221074 5004 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6cf6ea_c7f7_44bb_b1fa_9e8d5f1d9c94.slice/crio-conmon-aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c.scope\": RecentStats: unable to find data in memory cache]" Dec 03 15:10:23 crc kubenswrapper[5004]: E1203 15:10:23.460202 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:10:24 crc kubenswrapper[5004]: I1203 15:10:24.207239 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:10:24 crc kubenswrapper[5004]: E1203 15:10:24.207616 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:10:37 crc kubenswrapper[5004]: I1203 15:10:37.613453 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:10:37 crc kubenswrapper[5004]: E1203 15:10:37.614249 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:10:49 crc kubenswrapper[5004]: I1203 15:10:49.613562 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:10:49 crc kubenswrapper[5004]: E1203 15:10:49.614831 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:10:50 crc kubenswrapper[5004]: I1203 15:10:50.500280 5004 generic.go:334] "Generic (PLEG): container finished" podID="c13dffc5-936b-4314-b0e0-69ef17717995" containerID="9184952f6de45b2d5eea9661737eddf366e425fd5b67275466cae37aadf2f57b" exitCode=0 Dec 03 15:10:50 crc kubenswrapper[5004]: I1203 15:10:50.500391 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2gq7/must-gather-ppljs" event={"ID":"c13dffc5-936b-4314-b0e0-69ef17717995","Type":"ContainerDied","Data":"9184952f6de45b2d5eea9661737eddf366e425fd5b67275466cae37aadf2f57b"} Dec 03 15:10:50 crc kubenswrapper[5004]: I1203 15:10:50.501414 5004 scope.go:117] "RemoveContainer" containerID="9184952f6de45b2d5eea9661737eddf366e425fd5b67275466cae37aadf2f57b" Dec 03 15:10:51 crc kubenswrapper[5004]: I1203 15:10:51.277593 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p2gq7_must-gather-ppljs_c13dffc5-936b-4314-b0e0-69ef17717995/gather/0.log" Dec 03 15:10:59 crc kubenswrapper[5004]: I1203 15:10:59.184664 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-p2gq7/must-gather-ppljs"] Dec 03 15:10:59 crc kubenswrapper[5004]: I1203 15:10:59.186700 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-p2gq7/must-gather-ppljs" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="copy" containerID="cri-o://fc064887c483ce20a96bb6915150d3b9c153b172640276f421f3d5f79826e3fd" gracePeriod=2 Dec 03 15:10:59 crc kubenswrapper[5004]: I1203 15:10:59.194876 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p2gq7/must-gather-ppljs"] Dec 03 15:10:59 crc kubenswrapper[5004]: I1203 15:10:59.597882 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p2gq7_must-gather-ppljs_c13dffc5-936b-4314-b0e0-69ef17717995/copy/0.log" Dec 03 15:10:59 crc kubenswrapper[5004]: I1203 15:10:59.598218 5004 generic.go:334] "Generic (PLEG): container finished" podID="c13dffc5-936b-4314-b0e0-69ef17717995" containerID="fc064887c483ce20a96bb6915150d3b9c153b172640276f421f3d5f79826e3fd" exitCode=143 Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.057781 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p2gq7_must-gather-ppljs_c13dffc5-936b-4314-b0e0-69ef17717995/copy/0.log" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.058428 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.221923 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hc9l\" (UniqueName: \"kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l\") pod \"c13dffc5-936b-4314-b0e0-69ef17717995\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.222700 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output\") pod \"c13dffc5-936b-4314-b0e0-69ef17717995\" (UID: \"c13dffc5-936b-4314-b0e0-69ef17717995\") " Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.235078 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l" (OuterVolumeSpecName: "kube-api-access-6hc9l") pod "c13dffc5-936b-4314-b0e0-69ef17717995" (UID: "c13dffc5-936b-4314-b0e0-69ef17717995"). InnerVolumeSpecName "kube-api-access-6hc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.324755 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hc9l\" (UniqueName: \"kubernetes.io/projected/c13dffc5-936b-4314-b0e0-69ef17717995-kube-api-access-6hc9l\") on node \"crc\" DevicePath \"\"" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.359043 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c13dffc5-936b-4314-b0e0-69ef17717995" (UID: "c13dffc5-936b-4314-b0e0-69ef17717995"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.426281 5004 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c13dffc5-936b-4314-b0e0-69ef17717995-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.606634 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p2gq7_must-gather-ppljs_c13dffc5-936b-4314-b0e0-69ef17717995/copy/0.log" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.607059 5004 scope.go:117] "RemoveContainer" containerID="fc064887c483ce20a96bb6915150d3b9c153b172640276f421f3d5f79826e3fd" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.607097 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2gq7/must-gather-ppljs" Dec 03 15:11:00 crc kubenswrapper[5004]: I1203 15:11:00.628595 5004 scope.go:117] "RemoveContainer" containerID="9184952f6de45b2d5eea9661737eddf366e425fd5b67275466cae37aadf2f57b" Dec 03 15:11:01 crc kubenswrapper[5004]: I1203 15:11:01.625971 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" path="/var/lib/kubelet/pods/c13dffc5-936b-4314-b0e0-69ef17717995/volumes" Dec 03 15:11:04 crc kubenswrapper[5004]: I1203 15:11:04.612547 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:11:04 crc kubenswrapper[5004]: E1203 15:11:04.613371 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:11:16 crc kubenswrapper[5004]: I1203 15:11:16.613458 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:11:16 crc kubenswrapper[5004]: E1203 15:11:16.614582 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:11:31 crc kubenswrapper[5004]: I1203 15:11:31.612618 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:11:31 crc kubenswrapper[5004]: E1203 15:11:31.613782 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:11:44 crc kubenswrapper[5004]: I1203 15:11:44.613521 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:11:44 crc kubenswrapper[5004]: E1203 15:11:44.614941 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:11:59 crc kubenswrapper[5004]: I1203 15:11:59.612995 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:11:59 crc kubenswrapper[5004]: E1203 15:11:59.613752 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:12:12 crc kubenswrapper[5004]: I1203 15:12:12.614344 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:12:12 crc kubenswrapper[5004]: E1203 15:12:12.615493 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:12:26 crc kubenswrapper[5004]: I1203 15:12:26.612727 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:12:26 crc kubenswrapper[5004]: E1203 15:12:26.613611 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:12:40 crc kubenswrapper[5004]: I1203 15:12:40.613611 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:12:40 crc kubenswrapper[5004]: E1203 15:12:40.615788 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.043422 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:12:49 crc kubenswrapper[5004]: E1203 15:12:49.044326 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="gather" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044339 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="gather" Dec 03 15:12:49 crc kubenswrapper[5004]: E1203 15:12:49.044360 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a8ac34-f8d9-4547-a922-1796568b0dc8" containerName="container-00" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044366 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a8ac34-f8d9-4547-a922-1796568b0dc8" containerName="container-00" Dec 03 15:12:49 crc kubenswrapper[5004]: E1203 15:12:49.044382 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="copy" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044387 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="copy" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044570 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="copy" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044595 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13dffc5-936b-4314-b0e0-69ef17717995" containerName="gather" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.044611 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a8ac34-f8d9-4547-a922-1796568b0dc8" containerName="container-00" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.046285 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.068563 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.171211 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.171616 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " 
pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.171738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mff\" (UniqueName: \"kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.273919 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.273998 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mff\" (UniqueName: \"kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.274047 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.274676 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content\") pod \"community-operators-7j6x9\" (UID: 
\"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.274798 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.307736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mff\" (UniqueName: \"kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff\") pod \"community-operators-7j6x9\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.365510 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:49 crc kubenswrapper[5004]: I1203 15:12:49.852643 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:12:50 crc kubenswrapper[5004]: I1203 15:12:50.037073 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerStarted","Data":"52a2fec8f7eb0ec31fd183f735a81c9b09d37255c61636875de1f7393747ac64"} Dec 03 15:12:51 crc kubenswrapper[5004]: I1203 15:12:51.048363 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerID="4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa" exitCode=0 Dec 03 15:12:51 crc kubenswrapper[5004]: I1203 15:12:51.048436 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" 
event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerDied","Data":"4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa"} Dec 03 15:12:51 crc kubenswrapper[5004]: I1203 15:12:51.050932 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:12:52 crc kubenswrapper[5004]: I1203 15:12:52.058682 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerStarted","Data":"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc"} Dec 03 15:12:53 crc kubenswrapper[5004]: I1203 15:12:53.071281 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerID="8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc" exitCode=0 Dec 03 15:12:53 crc kubenswrapper[5004]: I1203 15:12:53.071333 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerDied","Data":"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc"} Dec 03 15:12:55 crc kubenswrapper[5004]: I1203 15:12:55.091214 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerStarted","Data":"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08"} Dec 03 15:12:55 crc kubenswrapper[5004]: I1203 15:12:55.116419 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7j6x9" podStartSLOduration=2.646653372 podStartE2EDuration="6.116398781s" podCreationTimestamp="2025-12-03 15:12:49 +0000 UTC" firstStartedPulling="2025-12-03 15:12:51.050706485 +0000 UTC m=+3983.799676721" lastFinishedPulling="2025-12-03 15:12:54.520451894 +0000 UTC 
m=+3987.269422130" observedRunningTime="2025-12-03 15:12:55.109252016 +0000 UTC m=+3987.858222262" watchObservedRunningTime="2025-12-03 15:12:55.116398781 +0000 UTC m=+3987.865369017" Dec 03 15:12:55 crc kubenswrapper[5004]: I1203 15:12:55.612997 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:12:55 crc kubenswrapper[5004]: E1203 15:12:55.613241 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:12:59 crc kubenswrapper[5004]: I1203 15:12:59.366235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:59 crc kubenswrapper[5004]: I1203 15:12:59.366910 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:12:59 crc kubenswrapper[5004]: I1203 15:12:59.411829 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:13:00 crc kubenswrapper[5004]: I1203 15:13:00.369740 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:13:00 crc kubenswrapper[5004]: I1203 15:13:00.438541 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.151283 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7j6x9" 
podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="registry-server" containerID="cri-o://9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08" gracePeriod=2 Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.645596 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.842448 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities\") pod \"b4a6f288-f5f2-480c-b409-74d0ed90787a\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.842599 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content\") pod \"b4a6f288-f5f2-480c-b409-74d0ed90787a\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.842835 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4mff\" (UniqueName: \"kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff\") pod \"b4a6f288-f5f2-480c-b409-74d0ed90787a\" (UID: \"b4a6f288-f5f2-480c-b409-74d0ed90787a\") " Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.843477 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities" (OuterVolumeSpecName: "utilities") pod "b4a6f288-f5f2-480c-b409-74d0ed90787a" (UID: "b4a6f288-f5f2-480c-b409-74d0ed90787a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.843622 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.850012 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff" (OuterVolumeSpecName: "kube-api-access-c4mff") pod "b4a6f288-f5f2-480c-b409-74d0ed90787a" (UID: "b4a6f288-f5f2-480c-b409-74d0ed90787a"). InnerVolumeSpecName "kube-api-access-c4mff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.920008 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4a6f288-f5f2-480c-b409-74d0ed90787a" (UID: "b4a6f288-f5f2-480c-b409-74d0ed90787a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.944894 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4mff\" (UniqueName: \"kubernetes.io/projected/b4a6f288-f5f2-480c-b409-74d0ed90787a-kube-api-access-c4mff\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:02 crc kubenswrapper[5004]: I1203 15:13:02.944924 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a6f288-f5f2-480c-b409-74d0ed90787a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.162388 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerID="9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08" exitCode=0 Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.162439 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerDied","Data":"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08"} Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.162445 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7j6x9" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.162471 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j6x9" event={"ID":"b4a6f288-f5f2-480c-b409-74d0ed90787a","Type":"ContainerDied","Data":"52a2fec8f7eb0ec31fd183f735a81c9b09d37255c61636875de1f7393747ac64"} Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.162493 5004 scope.go:117] "RemoveContainer" containerID="9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.184496 5004 scope.go:117] "RemoveContainer" containerID="8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.199027 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.208679 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7j6x9"] Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.223274 5004 scope.go:117] "RemoveContainer" containerID="4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.253585 5004 scope.go:117] "RemoveContainer" containerID="9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08" Dec 03 15:13:03 crc kubenswrapper[5004]: E1203 15:13:03.254022 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08\": container with ID starting with 9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08 not found: ID does not exist" containerID="9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.254057 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08"} err="failed to get container status \"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08\": rpc error: code = NotFound desc = could not find container \"9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08\": container with ID starting with 9043d1541848aa2072a7e157f0378ec9c1f5f933a3e6e1ec1c7dc05c3ad28d08 not found: ID does not exist" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.254081 5004 scope.go:117] "RemoveContainer" containerID="8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc" Dec 03 15:13:03 crc kubenswrapper[5004]: E1203 15:13:03.254824 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc\": container with ID starting with 8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc not found: ID does not exist" containerID="8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.254884 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc"} err="failed to get container status \"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc\": rpc error: code = NotFound desc = could not find container \"8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc\": container with ID starting with 8ccca0d6f5f12a6168a2dd967fb802fa8a56fe34e0f004a665ce744fa890d8dc not found: ID does not exist" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.254915 5004 scope.go:117] "RemoveContainer" containerID="4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa" Dec 03 15:13:03 crc kubenswrapper[5004]: E1203 
15:13:03.255199 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa\": container with ID starting with 4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa not found: ID does not exist" containerID="4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.255242 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa"} err="failed to get container status \"4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa\": rpc error: code = NotFound desc = could not find container \"4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa\": container with ID starting with 4da4c4958658bf5fe1fa90ad5f7a75de93ba7b198977e6d1e69e63484603d7aa not found: ID does not exist" Dec 03 15:13:03 crc kubenswrapper[5004]: I1203 15:13:03.625424 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" path="/var/lib/kubelet/pods/b4a6f288-f5f2-480c-b409-74d0ed90787a/volumes" Dec 03 15:13:07 crc kubenswrapper[5004]: I1203 15:13:07.630874 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:13:07 crc kubenswrapper[5004]: E1203 15:13:07.631589 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:13:18 crc kubenswrapper[5004]: I1203 15:13:18.612730 
5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:13:18 crc kubenswrapper[5004]: E1203 15:13:18.613394 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:13:32 crc kubenswrapper[5004]: I1203 15:13:32.613780 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:13:32 crc kubenswrapper[5004]: E1203 15:13:32.614718 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.847905 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:37 crc kubenswrapper[5004]: E1203 15:13:37.849034 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="registry-server" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.849052 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="registry-server" Dec 03 15:13:37 crc kubenswrapper[5004]: E1203 15:13:37.849095 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="extract-content" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.849101 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="extract-content" Dec 03 15:13:37 crc kubenswrapper[5004]: E1203 15:13:37.849114 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="extract-utilities" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.849120 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="extract-utilities" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.849380 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a6f288-f5f2-480c-b409-74d0ed90787a" containerName="registry-server" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.851708 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:37 crc kubenswrapper[5004]: I1203 15:13:37.872724 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.048072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.048151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88st\" (UniqueName: \"kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st\") pod \"certified-operators-qrqnw\" (UID: 
\"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.048171 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.159884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.160164 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88st\" (UniqueName: \"kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.160219 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.165139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content\") pod \"certified-operators-qrqnw\" (UID: 
\"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.167029 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.200408 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88st\" (UniqueName: \"kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st\") pod \"certified-operators-qrqnw\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.493933 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:38 crc kubenswrapper[5004]: I1203 15:13:38.962178 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:39 crc kubenswrapper[5004]: I1203 15:13:39.709979 5004 generic.go:334] "Generic (PLEG): container finished" podID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerID="0829da98d3713bd29e52e3eb5a9914be41d7dd8e4d388147d565c24258a2bfc8" exitCode=0 Dec 03 15:13:39 crc kubenswrapper[5004]: I1203 15:13:39.710266 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerDied","Data":"0829da98d3713bd29e52e3eb5a9914be41d7dd8e4d388147d565c24258a2bfc8"} Dec 03 15:13:39 crc kubenswrapper[5004]: I1203 15:13:39.710299 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" 
event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerStarted","Data":"554e1572a0ec451a8bab8752d8c6197bcdf966bfb540c6858a8ed5f46728a956"} Dec 03 15:13:40 crc kubenswrapper[5004]: I1203 15:13:40.723675 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerStarted","Data":"d5d3759d502448d1f17c58b0469ce7a8b6654ca12ec5de3c6fcc2bdf6a4ccc5f"} Dec 03 15:13:41 crc kubenswrapper[5004]: I1203 15:13:41.739286 5004 generic.go:334] "Generic (PLEG): container finished" podID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerID="d5d3759d502448d1f17c58b0469ce7a8b6654ca12ec5de3c6fcc2bdf6a4ccc5f" exitCode=0 Dec 03 15:13:41 crc kubenswrapper[5004]: I1203 15:13:41.739398 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerDied","Data":"d5d3759d502448d1f17c58b0469ce7a8b6654ca12ec5de3c6fcc2bdf6a4ccc5f"} Dec 03 15:13:42 crc kubenswrapper[5004]: I1203 15:13:42.754904 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerStarted","Data":"fc7d73c5bab17ca2111abda972d2ba1e3cfc2ffd1df03332fe4de16a35de1033"} Dec 03 15:13:42 crc kubenswrapper[5004]: I1203 15:13:42.793080 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrqnw" podStartSLOduration=3.319465252 podStartE2EDuration="5.793052232s" podCreationTimestamp="2025-12-03 15:13:37 +0000 UTC" firstStartedPulling="2025-12-03 15:13:39.712424067 +0000 UTC m=+4032.461394313" lastFinishedPulling="2025-12-03 15:13:42.186011017 +0000 UTC m=+4034.934981293" observedRunningTime="2025-12-03 15:13:42.778279039 +0000 UTC m=+4035.527249305" watchObservedRunningTime="2025-12-03 15:13:42.793052232 +0000 UTC 
m=+4035.542022508" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.609561 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.612839 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.646916 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.737825 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgc9\" (UniqueName: \"kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.738148 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.738338 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.840196 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.840288 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.840425 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgc9\" (UniqueName: \"kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.840787 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.840844 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.861168 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgc9\" (UniqueName: 
\"kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9\") pod \"redhat-operators-w8cnd\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:46 crc kubenswrapper[5004]: I1203 15:13:46.941024 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:47 crc kubenswrapper[5004]: I1203 15:13:47.413965 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:13:47 crc kubenswrapper[5004]: I1203 15:13:47.628391 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:13:47 crc kubenswrapper[5004]: E1203 15:13:47.629039 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:13:47 crc kubenswrapper[5004]: I1203 15:13:47.805717 5004 generic.go:334] "Generic (PLEG): container finished" podID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerID="5cf5186b948e0990ceec7a1ec81105abea841d261e14b80ba7bf5cd07e8e19b3" exitCode=0 Dec 03 15:13:47 crc kubenswrapper[5004]: I1203 15:13:47.805769 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerDied","Data":"5cf5186b948e0990ceec7a1ec81105abea841d261e14b80ba7bf5cd07e8e19b3"} Dec 03 15:13:47 crc kubenswrapper[5004]: I1203 15:13:47.806009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" 
event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerStarted","Data":"332ec6de63dd9d03e9383131435bb03036719543bcd86789a2015097a5b07192"} Dec 03 15:13:48 crc kubenswrapper[5004]: I1203 15:13:48.499068 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:48 crc kubenswrapper[5004]: I1203 15:13:48.499808 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:48 crc kubenswrapper[5004]: I1203 15:13:48.579344 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:48 crc kubenswrapper[5004]: I1203 15:13:48.818092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerStarted","Data":"ca3079ca3424105d28788065cd004a9794e0bb0727d5c260d4a7f9e0e58827b1"} Dec 03 15:13:48 crc kubenswrapper[5004]: I1203 15:13:48.869933 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:49 crc kubenswrapper[5004]: I1203 15:13:49.827470 5004 generic.go:334] "Generic (PLEG): container finished" podID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerID="ca3079ca3424105d28788065cd004a9794e0bb0727d5c260d4a7f9e0e58827b1" exitCode=0 Dec 03 15:13:49 crc kubenswrapper[5004]: I1203 15:13:49.828463 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerDied","Data":"ca3079ca3424105d28788065cd004a9794e0bb0727d5c260d4a7f9e0e58827b1"} Dec 03 15:13:50 crc kubenswrapper[5004]: I1203 15:13:50.838597 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" 
event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerStarted","Data":"fdecd13e767a1867ac9f3f16fd68edd36ed6bf37b27374bbf2d9726edd7ca818"} Dec 03 15:13:50 crc kubenswrapper[5004]: I1203 15:13:50.857621 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8cnd" podStartSLOduration=2.445788478 podStartE2EDuration="4.857601989s" podCreationTimestamp="2025-12-03 15:13:46 +0000 UTC" firstStartedPulling="2025-12-03 15:13:47.807035905 +0000 UTC m=+4040.556006141" lastFinishedPulling="2025-12-03 15:13:50.218849396 +0000 UTC m=+4042.967819652" observedRunningTime="2025-12-03 15:13:50.857069644 +0000 UTC m=+4043.606039880" watchObservedRunningTime="2025-12-03 15:13:50.857601989 +0000 UTC m=+4043.606572225" Dec 03 15:13:53 crc kubenswrapper[5004]: I1203 15:13:53.003099 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:53 crc kubenswrapper[5004]: I1203 15:13:53.003790 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrqnw" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="registry-server" containerID="cri-o://fc7d73c5bab17ca2111abda972d2ba1e3cfc2ffd1df03332fe4de16a35de1033" gracePeriod=2 Dec 03 15:13:53 crc kubenswrapper[5004]: I1203 15:13:53.864151 5004 generic.go:334] "Generic (PLEG): container finished" podID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerID="fc7d73c5bab17ca2111abda972d2ba1e3cfc2ffd1df03332fe4de16a35de1033" exitCode=0 Dec 03 15:13:53 crc kubenswrapper[5004]: I1203 15:13:53.864192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerDied","Data":"fc7d73c5bab17ca2111abda972d2ba1e3cfc2ffd1df03332fe4de16a35de1033"} Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.006926 5004 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.099723 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities\") pod \"a61be5b7-1368-4e63-8c0c-634a662c0890\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.099845 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content\") pod \"a61be5b7-1368-4e63-8c0c-634a662c0890\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.099947 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88st\" (UniqueName: \"kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st\") pod \"a61be5b7-1368-4e63-8c0c-634a662c0890\" (UID: \"a61be5b7-1368-4e63-8c0c-634a662c0890\") " Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.100632 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities" (OuterVolumeSpecName: "utilities") pod "a61be5b7-1368-4e63-8c0c-634a662c0890" (UID: "a61be5b7-1368-4e63-8c0c-634a662c0890"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.108261 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st" (OuterVolumeSpecName: "kube-api-access-n88st") pod "a61be5b7-1368-4e63-8c0c-634a662c0890" (UID: "a61be5b7-1368-4e63-8c0c-634a662c0890"). 
InnerVolumeSpecName "kube-api-access-n88st". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.156847 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a61be5b7-1368-4e63-8c0c-634a662c0890" (UID: "a61be5b7-1368-4e63-8c0c-634a662c0890"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.202792 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88st\" (UniqueName: \"kubernetes.io/projected/a61be5b7-1368-4e63-8c0c-634a662c0890-kube-api-access-n88st\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.202824 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.202853 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a61be5b7-1368-4e63-8c0c-634a662c0890-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.879089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqnw" event={"ID":"a61be5b7-1368-4e63-8c0c-634a662c0890","Type":"ContainerDied","Data":"554e1572a0ec451a8bab8752d8c6197bcdf966bfb540c6858a8ed5f46728a956"} Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.879143 5004 scope.go:117] "RemoveContainer" containerID="fc7d73c5bab17ca2111abda972d2ba1e3cfc2ffd1df03332fe4de16a35de1033" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.879284 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrqnw" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.905193 5004 scope.go:117] "RemoveContainer" containerID="d5d3759d502448d1f17c58b0469ce7a8b6654ca12ec5de3c6fcc2bdf6a4ccc5f" Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.918001 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.929576 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrqnw"] Dec 03 15:13:54 crc kubenswrapper[5004]: I1203 15:13:54.946976 5004 scope.go:117] "RemoveContainer" containerID="0829da98d3713bd29e52e3eb5a9914be41d7dd8e4d388147d565c24258a2bfc8" Dec 03 15:13:55 crc kubenswrapper[5004]: I1203 15:13:55.629599 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" path="/var/lib/kubelet/pods/a61be5b7-1368-4e63-8c0c-634a662c0890/volumes" Dec 03 15:13:56 crc kubenswrapper[5004]: I1203 15:13:56.941844 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:56 crc kubenswrapper[5004]: I1203 15:13:56.942712 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:56 crc kubenswrapper[5004]: I1203 15:13:56.991246 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:57 crc kubenswrapper[5004]: I1203 15:13:57.955154 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:13:58 crc kubenswrapper[5004]: I1203 15:13:58.804234 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:13:59 crc kubenswrapper[5004]: I1203 
15:13:59.922523 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8cnd" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="registry-server" containerID="cri-o://fdecd13e767a1867ac9f3f16fd68edd36ed6bf37b27374bbf2d9726edd7ca818" gracePeriod=2 Dec 03 15:14:00 crc kubenswrapper[5004]: I1203 15:14:00.613201 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:14:00 crc kubenswrapper[5004]: E1203 15:14:00.613456 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:14:00 crc kubenswrapper[5004]: I1203 15:14:00.937373 5004 generic.go:334] "Generic (PLEG): container finished" podID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerID="fdecd13e767a1867ac9f3f16fd68edd36ed6bf37b27374bbf2d9726edd7ca818" exitCode=0 Dec 03 15:14:00 crc kubenswrapper[5004]: I1203 15:14:00.937461 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerDied","Data":"fdecd13e767a1867ac9f3f16fd68edd36ed6bf37b27374bbf2d9726edd7ca818"} Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.555415 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.645172 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content\") pod \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.645265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfgc9\" (UniqueName: \"kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9\") pod \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.645393 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities\") pod \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\" (UID: \"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b\") " Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.646553 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities" (OuterVolumeSpecName: "utilities") pod "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" (UID: "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.652026 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9" (OuterVolumeSpecName: "kube-api-access-pfgc9") pod "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" (UID: "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b"). InnerVolumeSpecName "kube-api-access-pfgc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.747836 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.747878 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfgc9\" (UniqueName: \"kubernetes.io/projected/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-kube-api-access-pfgc9\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.748181 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" (UID: "d3e8a024-c6a9-44c6-96a6-d6700bf8de1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.849946 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.951941 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8cnd" event={"ID":"d3e8a024-c6a9-44c6-96a6-d6700bf8de1b","Type":"ContainerDied","Data":"332ec6de63dd9d03e9383131435bb03036719543bcd86789a2015097a5b07192"} Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.952011 5004 scope.go:117] "RemoveContainer" containerID="fdecd13e767a1867ac9f3f16fd68edd36ed6bf37b27374bbf2d9726edd7ca818" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.954015 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8cnd" Dec 03 15:14:01 crc kubenswrapper[5004]: I1203 15:14:01.981631 5004 scope.go:117] "RemoveContainer" containerID="ca3079ca3424105d28788065cd004a9794e0bb0727d5c260d4a7f9e0e58827b1" Dec 03 15:14:02 crc kubenswrapper[5004]: I1203 15:14:02.009706 5004 scope.go:117] "RemoveContainer" containerID="5cf5186b948e0990ceec7a1ec81105abea841d261e14b80ba7bf5cd07e8e19b3" Dec 03 15:14:02 crc kubenswrapper[5004]: I1203 15:14:02.015664 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:14:02 crc kubenswrapper[5004]: I1203 15:14:02.024539 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8cnd"] Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.546126 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zlkfg/must-gather-xrhp6"] Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.546946 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.546963 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.546995 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="extract-content" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547003 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="extract-content" Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.547026 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="extract-content" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 
15:14:03.547033 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="extract-content" Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.547043 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="extract-utilities" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547050 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="extract-utilities" Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.547068 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547075 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: E1203 15:14:03.547092 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="extract-utilities" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547100 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="extract-utilities" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547331 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.547345 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61be5b7-1368-4e63-8c0c-634a662c0890" containerName="registry-server" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.548618 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.551556 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zlkfg"/"kube-root-ca.crt" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.551843 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zlkfg"/"openshift-service-ca.crt" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.564947 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zlkfg/must-gather-xrhp6"] Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.600337 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output\") pod \"must-gather-xrhp6\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.600434 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2qth\" (UniqueName: \"kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth\") pod \"must-gather-xrhp6\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.641026 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e8a024-c6a9-44c6-96a6-d6700bf8de1b" path="/var/lib/kubelet/pods/d3e8a024-c6a9-44c6-96a6-d6700bf8de1b/volumes" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.702129 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output\") pod \"must-gather-xrhp6\" (UID: 
\"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.702578 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output\") pod \"must-gather-xrhp6\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.702682 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2qth\" (UniqueName: \"kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth\") pod \"must-gather-xrhp6\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.721952 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2qth\" (UniqueName: \"kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth\") pod \"must-gather-xrhp6\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:03 crc kubenswrapper[5004]: I1203 15:14:03.872740 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:14:04 crc kubenswrapper[5004]: I1203 15:14:04.329975 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zlkfg/must-gather-xrhp6"] Dec 03 15:14:04 crc kubenswrapper[5004]: I1203 15:14:04.996936 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" event={"ID":"e024fbff-5e92-4fc3-b5d8-31a69957a91f","Type":"ContainerStarted","Data":"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069"} Dec 03 15:14:04 crc kubenswrapper[5004]: I1203 15:14:04.997284 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" event={"ID":"e024fbff-5e92-4fc3-b5d8-31a69957a91f","Type":"ContainerStarted","Data":"18f1294bab49eaee52f354203f7e018a744023c054937a06454eccc9f98c5dbc"} Dec 03 15:14:06 crc kubenswrapper[5004]: I1203 15:14:06.009805 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" event={"ID":"e024fbff-5e92-4fc3-b5d8-31a69957a91f","Type":"ContainerStarted","Data":"50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a"} Dec 03 15:14:06 crc kubenswrapper[5004]: I1203 15:14:06.031186 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" podStartSLOduration=3.031164078 podStartE2EDuration="3.031164078s" podCreationTimestamp="2025-12-03 15:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:14:06.026444273 +0000 UTC m=+4058.775414509" watchObservedRunningTime="2025-12-03 15:14:06.031164078 +0000 UTC m=+4058.780134324" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.402292 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-tbv9g"] Dec 03 15:14:08 crc kubenswrapper[5004]: 
I1203 15:14:08.403917 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.406066 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zlkfg"/"default-dockercfg-f5ccs" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.408133 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvtc\" (UniqueName: \"kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.408216 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.509995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.510137 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.510174 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-snvtc\" (UniqueName: \"kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.534575 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvtc\" (UniqueName: \"kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc\") pod \"crc-debug-tbv9g\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:08 crc kubenswrapper[5004]: I1203 15:14:08.722654 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:09 crc kubenswrapper[5004]: I1203 15:14:09.036361 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" event={"ID":"d69a3e7d-13c6-444a-8dd7-76c026d5eb21","Type":"ContainerStarted","Data":"4d2360cb38a65ea21ae5936c119d04abfcb34c4b6f26f0c08bca047fc3ba1b9b"} Dec 03 15:14:10 crc kubenswrapper[5004]: I1203 15:14:10.052271 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" event={"ID":"d69a3e7d-13c6-444a-8dd7-76c026d5eb21","Type":"ContainerStarted","Data":"3339af3466aa2d99e8c73e8a2122b4e74fd89f8465126a038e565357e1ea24cc"} Dec 03 15:14:10 crc kubenswrapper[5004]: I1203 15:14:10.068785 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" podStartSLOduration=2.068759659 podStartE2EDuration="2.068759659s" podCreationTimestamp="2025-12-03 15:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 15:14:10.063530479 +0000 UTC m=+4062.812500715" watchObservedRunningTime="2025-12-03 
15:14:10.068759659 +0000 UTC m=+4062.817729905" Dec 03 15:14:12 crc kubenswrapper[5004]: I1203 15:14:12.613451 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:14:12 crc kubenswrapper[5004]: E1203 15:14:12.614139 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:14:27 crc kubenswrapper[5004]: I1203 15:14:27.624000 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:14:27 crc kubenswrapper[5004]: E1203 15:14:27.625029 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:14:38 crc kubenswrapper[5004]: I1203 15:14:38.613427 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:14:38 crc kubenswrapper[5004]: E1203 15:14:38.614231 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:14:43 crc kubenswrapper[5004]: I1203 15:14:43.342412 5004 generic.go:334] "Generic (PLEG): container finished" podID="d69a3e7d-13c6-444a-8dd7-76c026d5eb21" containerID="3339af3466aa2d99e8c73e8a2122b4e74fd89f8465126a038e565357e1ea24cc" exitCode=0 Dec 03 15:14:43 crc kubenswrapper[5004]: I1203 15:14:43.342615 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" event={"ID":"d69a3e7d-13c6-444a-8dd7-76c026d5eb21","Type":"ContainerDied","Data":"3339af3466aa2d99e8c73e8a2122b4e74fd89f8465126a038e565357e1ea24cc"} Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.808580 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.837032 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-tbv9g"] Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.845394 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-tbv9g"] Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.963448 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvtc\" (UniqueName: \"kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc\") pod \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.963550 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host\") pod \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\" (UID: \"d69a3e7d-13c6-444a-8dd7-76c026d5eb21\") " Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.963690 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host" (OuterVolumeSpecName: "host") pod "d69a3e7d-13c6-444a-8dd7-76c026d5eb21" (UID: "d69a3e7d-13c6-444a-8dd7-76c026d5eb21"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.964134 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:44 crc kubenswrapper[5004]: I1203 15:14:44.969779 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc" (OuterVolumeSpecName: "kube-api-access-snvtc") pod "d69a3e7d-13c6-444a-8dd7-76c026d5eb21" (UID: "d69a3e7d-13c6-444a-8dd7-76c026d5eb21"). InnerVolumeSpecName "kube-api-access-snvtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:14:45 crc kubenswrapper[5004]: I1203 15:14:45.066407 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snvtc\" (UniqueName: \"kubernetes.io/projected/d69a3e7d-13c6-444a-8dd7-76c026d5eb21-kube-api-access-snvtc\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:45 crc kubenswrapper[5004]: I1203 15:14:45.361523 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2360cb38a65ea21ae5936c119d04abfcb34c4b6f26f0c08bca047fc3ba1b9b" Dec 03 15:14:45 crc kubenswrapper[5004]: I1203 15:14:45.361583 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-tbv9g" Dec 03 15:14:45 crc kubenswrapper[5004]: I1203 15:14:45.624174 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69a3e7d-13c6-444a-8dd7-76c026d5eb21" path="/var/lib/kubelet/pods/d69a3e7d-13c6-444a-8dd7-76c026d5eb21/volumes" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.027158 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-mr7l4"] Dec 03 15:14:46 crc kubenswrapper[5004]: E1203 15:14:46.027637 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69a3e7d-13c6-444a-8dd7-76c026d5eb21" containerName="container-00" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.027657 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69a3e7d-13c6-444a-8dd7-76c026d5eb21" containerName="container-00" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.027952 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69a3e7d-13c6-444a-8dd7-76c026d5eb21" containerName="container-00" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.028751 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.033219 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zlkfg"/"default-dockercfg-f5ccs" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.187057 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnpdx\" (UniqueName: \"kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.187172 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.288890 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnpdx\" (UniqueName: \"kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.289008 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.289101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.508551 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnpdx\" (UniqueName: \"kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx\") pod \"crc-debug-mr7l4\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:46 crc kubenswrapper[5004]: I1203 15:14:46.649432 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:47 crc kubenswrapper[5004]: I1203 15:14:47.380490 5004 generic.go:334] "Generic (PLEG): container finished" podID="a15a968e-a9b4-4c26-9843-be64217a6894" containerID="48f87d744aab4f1492061930a8d4c81417e3a39d111a2b1a02998ff0e2c40831" exitCode=0 Dec 03 15:14:47 crc kubenswrapper[5004]: I1203 15:14:47.380595 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" event={"ID":"a15a968e-a9b4-4c26-9843-be64217a6894","Type":"ContainerDied","Data":"48f87d744aab4f1492061930a8d4c81417e3a39d111a2b1a02998ff0e2c40831"} Dec 03 15:14:47 crc kubenswrapper[5004]: I1203 15:14:47.381060 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" event={"ID":"a15a968e-a9b4-4c26-9843-be64217a6894","Type":"ContainerStarted","Data":"165d64eb54fb5a736579b224f588281f3d7553197d548ab2efac0e569c9af25d"} Dec 03 15:14:47 crc kubenswrapper[5004]: I1203 15:14:47.850725 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-mr7l4"] Dec 03 15:14:47 crc kubenswrapper[5004]: I1203 15:14:47.858342 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-zlkfg/crc-debug-mr7l4"] Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.524871 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.630101 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host\") pod \"a15a968e-a9b4-4c26-9843-be64217a6894\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.630149 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnpdx\" (UniqueName: \"kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx\") pod \"a15a968e-a9b4-4c26-9843-be64217a6894\" (UID: \"a15a968e-a9b4-4c26-9843-be64217a6894\") " Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.630271 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host" (OuterVolumeSpecName: "host") pod "a15a968e-a9b4-4c26-9843-be64217a6894" (UID: "a15a968e-a9b4-4c26-9843-be64217a6894"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.630554 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a15a968e-a9b4-4c26-9843-be64217a6894-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.645146 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx" (OuterVolumeSpecName: "kube-api-access-lnpdx") pod "a15a968e-a9b4-4c26-9843-be64217a6894" (UID: "a15a968e-a9b4-4c26-9843-be64217a6894"). 
InnerVolumeSpecName "kube-api-access-lnpdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:14:48 crc kubenswrapper[5004]: I1203 15:14:48.732614 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnpdx\" (UniqueName: \"kubernetes.io/projected/a15a968e-a9b4-4c26-9843-be64217a6894-kube-api-access-lnpdx\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.049115 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-kg7nl"] Dec 03 15:14:49 crc kubenswrapper[5004]: E1203 15:14:49.049500 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15a968e-a9b4-4c26-9843-be64217a6894" containerName="container-00" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.049512 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15a968e-a9b4-4c26-9843-be64217a6894" containerName="container-00" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.049685 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15a968e-a9b4-4c26-9843-be64217a6894" containerName="container-00" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.050316 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.139886 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.139986 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87b7w\" (UniqueName: \"kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.242109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.242189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87b7w\" (UniqueName: \"kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.242342 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc 
kubenswrapper[5004]: I1203 15:14:49.266298 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87b7w\" (UniqueName: \"kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w\") pod \"crc-debug-kg7nl\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.368167 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.414847 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165d64eb54fb5a736579b224f588281f3d7553197d548ab2efac0e569c9af25d" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.414954 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-mr7l4" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.614227 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:14:49 crc kubenswrapper[5004]: E1203 15:14:49.614503 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:14:49 crc kubenswrapper[5004]: I1203 15:14:49.652296 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15a968e-a9b4-4c26-9843-be64217a6894" path="/var/lib/kubelet/pods/a15a968e-a9b4-4c26-9843-be64217a6894/volumes" Dec 03 15:14:49 crc kubenswrapper[5004]: E1203 15:14:49.938846 5004 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4a67149_870e_4476_8d92_c1bb25173d46.slice/crio-conmon-8b5768f3dc2906f262f7b7b0cfb237d318031a0a81a6943cdc82364ecccf1fea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4a67149_870e_4476_8d92_c1bb25173d46.slice/crio-8b5768f3dc2906f262f7b7b0cfb237d318031a0a81a6943cdc82364ecccf1fea.scope\": RecentStats: unable to find data in memory cache]" Dec 03 15:14:50 crc kubenswrapper[5004]: I1203 15:14:50.423484 5004 generic.go:334] "Generic (PLEG): container finished" podID="c4a67149-870e-4476-8d92-c1bb25173d46" containerID="8b5768f3dc2906f262f7b7b0cfb237d318031a0a81a6943cdc82364ecccf1fea" exitCode=0 Dec 03 15:14:50 crc kubenswrapper[5004]: I1203 15:14:50.423531 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" event={"ID":"c4a67149-870e-4476-8d92-c1bb25173d46","Type":"ContainerDied","Data":"8b5768f3dc2906f262f7b7b0cfb237d318031a0a81a6943cdc82364ecccf1fea"} Dec 03 15:14:50 crc kubenswrapper[5004]: I1203 15:14:50.423556 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" event={"ID":"c4a67149-870e-4476-8d92-c1bb25173d46","Type":"ContainerStarted","Data":"df650caa3aa715594bd1e8b82e84e106aa9066d2876e68d0d16491d416dac43f"} Dec 03 15:14:50 crc kubenswrapper[5004]: I1203 15:14:50.470725 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-kg7nl"] Dec 03 15:14:50 crc kubenswrapper[5004]: I1203 15:14:50.480308 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zlkfg/crc-debug-kg7nl"] Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.560415 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.693980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87b7w\" (UniqueName: \"kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w\") pod \"c4a67149-870e-4476-8d92-c1bb25173d46\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.694035 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host\") pod \"c4a67149-870e-4476-8d92-c1bb25173d46\" (UID: \"c4a67149-870e-4476-8d92-c1bb25173d46\") " Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.694226 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host" (OuterVolumeSpecName: "host") pod "c4a67149-870e-4476-8d92-c1bb25173d46" (UID: "c4a67149-870e-4476-8d92-c1bb25173d46"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.694632 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4a67149-870e-4476-8d92-c1bb25173d46-host\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.704248 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w" (OuterVolumeSpecName: "kube-api-access-87b7w") pod "c4a67149-870e-4476-8d92-c1bb25173d46" (UID: "c4a67149-870e-4476-8d92-c1bb25173d46"). InnerVolumeSpecName "kube-api-access-87b7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:14:51 crc kubenswrapper[5004]: I1203 15:14:51.796584 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87b7w\" (UniqueName: \"kubernetes.io/projected/c4a67149-870e-4476-8d92-c1bb25173d46-kube-api-access-87b7w\") on node \"crc\" DevicePath \"\"" Dec 03 15:14:52 crc kubenswrapper[5004]: I1203 15:14:52.442347 5004 scope.go:117] "RemoveContainer" containerID="8b5768f3dc2906f262f7b7b0cfb237d318031a0a81a6943cdc82364ecccf1fea" Dec 03 15:14:52 crc kubenswrapper[5004]: I1203 15:14:52.442413 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zlkfg/crc-debug-kg7nl" Dec 03 15:14:53 crc kubenswrapper[5004]: I1203 15:14:53.623568 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a67149-870e-4476-8d92-c1bb25173d46" path="/var/lib/kubelet/pods/c4a67149-870e-4476-8d92-c1bb25173d46/volumes" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.181510 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj"] Dec 03 15:15:00 crc kubenswrapper[5004]: E1203 15:15:00.182551 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a67149-870e-4476-8d92-c1bb25173d46" containerName="container-00" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.182568 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a67149-870e-4476-8d92-c1bb25173d46" containerName="container-00" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.182775 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a67149-870e-4476-8d92-c1bb25173d46" containerName="container-00" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.183546 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.185674 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.187140 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.193905 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj"] Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.346738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.346805 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqrm\" (UniqueName: \"kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.347582 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.449341 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.449518 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.449569 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqrm\" (UniqueName: \"kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.451069 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.462541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.491563 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqrm\" (UniqueName: \"kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm\") pod \"collect-profiles-29412915-wm4vj\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:00 crc kubenswrapper[5004]: I1203 15:15:00.507338 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:01 crc kubenswrapper[5004]: I1203 15:15:01.063429 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj"] Dec 03 15:15:01 crc kubenswrapper[5004]: I1203 15:15:01.545008 5004 generic.go:334] "Generic (PLEG): container finished" podID="fddb107b-d7e9-405f-b79c-d7abadf7735b" containerID="3a5117a70071196a567af20a3268dfa321910b826f9662d9c7ef2163d734b499" exitCode=0 Dec 03 15:15:01 crc kubenswrapper[5004]: I1203 15:15:01.545070 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" event={"ID":"fddb107b-d7e9-405f-b79c-d7abadf7735b","Type":"ContainerDied","Data":"3a5117a70071196a567af20a3268dfa321910b826f9662d9c7ef2163d734b499"} Dec 03 15:15:01 crc kubenswrapper[5004]: I1203 15:15:01.545217 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" 
event={"ID":"fddb107b-d7e9-405f-b79c-d7abadf7735b","Type":"ContainerStarted","Data":"b239063053d057388158d158a96f6cab67dacdbb697789143ce787fbebb92d1b"} Dec 03 15:15:01 crc kubenswrapper[5004]: I1203 15:15:01.614020 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:15:01 crc kubenswrapper[5004]: E1203 15:15:01.617455 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:15:02 crc kubenswrapper[5004]: I1203 15:15:02.916953 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.098690 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume\") pod \"fddb107b-d7e9-405f-b79c-d7abadf7735b\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.099011 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume\") pod \"fddb107b-d7e9-405f-b79c-d7abadf7735b\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.099046 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqrm\" (UniqueName: 
\"kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm\") pod \"fddb107b-d7e9-405f-b79c-d7abadf7735b\" (UID: \"fddb107b-d7e9-405f-b79c-d7abadf7735b\") " Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.099689 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume" (OuterVolumeSpecName: "config-volume") pod "fddb107b-d7e9-405f-b79c-d7abadf7735b" (UID: "fddb107b-d7e9-405f-b79c-d7abadf7735b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.104985 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fddb107b-d7e9-405f-b79c-d7abadf7735b" (UID: "fddb107b-d7e9-405f-b79c-d7abadf7735b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.108029 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm" (OuterVolumeSpecName: "kube-api-access-wnqrm") pod "fddb107b-d7e9-405f-b79c-d7abadf7735b" (UID: "fddb107b-d7e9-405f-b79c-d7abadf7735b"). InnerVolumeSpecName "kube-api-access-wnqrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.201203 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fddb107b-d7e9-405f-b79c-d7abadf7735b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.201237 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fddb107b-d7e9-405f-b79c-d7abadf7735b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.201249 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqrm\" (UniqueName: \"kubernetes.io/projected/fddb107b-d7e9-405f-b79c-d7abadf7735b-kube-api-access-wnqrm\") on node \"crc\" DevicePath \"\"" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.576259 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" event={"ID":"fddb107b-d7e9-405f-b79c-d7abadf7735b","Type":"ContainerDied","Data":"b239063053d057388158d158a96f6cab67dacdbb697789143ce787fbebb92d1b"} Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.576301 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b239063053d057388158d158a96f6cab67dacdbb697789143ce787fbebb92d1b" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.576325 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412915-wm4vj" Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.992392 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"] Dec 03 15:15:03 crc kubenswrapper[5004]: I1203 15:15:03.999469 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412870-k2ptt"] Dec 03 15:15:05 crc kubenswrapper[5004]: I1203 15:15:05.625666 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b75fdb-1d0a-4b9b-a615-57cea78634da" path="/var/lib/kubelet/pods/c6b75fdb-1d0a-4b9b-a615-57cea78634da/volumes" Dec 03 15:15:15 crc kubenswrapper[5004]: I1203 15:15:15.613552 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:15:15 crc kubenswrapper[5004]: E1203 15:15:15.614253 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:15:18 crc kubenswrapper[5004]: I1203 15:15:18.577038 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f76744786-jfgf7_939c0a06-65e2-45ea-b58d-7d4cc431207b/barbican-api/0.log" Dec 03 15:15:18 crc kubenswrapper[5004]: I1203 15:15:18.674645 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f76744786-jfgf7_939c0a06-65e2-45ea-b58d-7d4cc431207b/barbican-api-log/0.log" Dec 03 15:15:18 crc kubenswrapper[5004]: I1203 15:15:18.838879 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7c799f8fcd-gg559_c4501cb8-8287-4e9d-83b2-858fcb7c431c/barbican-keystone-listener/0.log" Dec 03 15:15:18 crc kubenswrapper[5004]: I1203 15:15:18.845256 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c799f8fcd-gg559_c4501cb8-8287-4e9d-83b2-858fcb7c431c/barbican-keystone-listener-log/0.log" Dec 03 15:15:18 crc kubenswrapper[5004]: I1203 15:15:18.975512 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56979bf587-swwll_033d65ef-e917-445c-9c56-cffb8b328dbf/barbican-worker/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.031095 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56979bf587-swwll_033d65ef-e917-445c-9c56-cffb8b328dbf/barbican-worker-log/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.149552 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zwpm4_7a8c5468-695c-4238-9cae-3b010f6987ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.280769 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/ceilometer-central-agent/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.385842 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/proxy-httpd/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.417179 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/ceilometer-notification-agent/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.433643 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbee9411-e6cf-4d99-89f8-788a0529e8e2/sg-core/0.log" Dec 
03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.604415 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e715befa-4ae4-4466-beb4-ee8939e3bb86/cinder-api/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.611513 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e715befa-4ae4-4466-beb4-ee8939e3bb86/cinder-api-log/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.853799 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7502099c-9fa6-4071-8ce6-4471b9f44f78/cinder-scheduler/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.856117 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7502099c-9fa6-4071-8ce6-4471b9f44f78/probe/0.log" Dec 03 15:15:19 crc kubenswrapper[5004]: I1203 15:15:19.876989 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-gvfbv_632aa0c1-b525-45af-8254-2f0f0dc57c43/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.060097 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sqfng_652a6191-a7f2-47a8-9f26-48137e58ce1b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.130378 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/init/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.252562 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/init/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.342796 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-78lk4_cccca89a-106f-4827-b398-81f1459b6648/dnsmasq-dns/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.359232 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p6vmt_faf69ec7-959a-404b-9bae-24bc3c528c28/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.603784 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f95292-1d71-478d-ab12-138e2b34bd3f/glance-log/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.613620 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2f95292-1d71-478d-ab12-138e2b34bd3f/glance-httpd/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.703620 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3959efd9-7c4e-43b5-b73a-9b05ec3fb59c/glance-httpd/0.log" Dec 03 15:15:20 crc kubenswrapper[5004]: I1203 15:15:20.782401 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3959efd9-7c4e-43b5-b73a-9b05ec3fb59c/glance-log/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.026198 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79df97d86b-4dr9p_e0f1c734-5c6e-4f15-8f11-1e3c1da2d880/horizon/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.074671 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-wx45m_4f32dcbb-677a-48e6-9c25-eaec1655a155/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.311957 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9ggz8_5de0359a-b8f8-4989-8739-ee565cd596fe/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.315609 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79df97d86b-4dr9p_e0f1c734-5c6e-4f15-8f11-1e3c1da2d880/horizon-log/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.568551 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412901-5c859_ad605cec-0786-4e4e-a1f2-9626a34e39c8/keystone-cron/0.log" Dec 03 15:15:21 crc kubenswrapper[5004]: I1203 15:15:21.692307 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-65f67fcd5d-5p75z_7efdab5a-a074-4ce4-bcc0-b2b8481b886c/keystone-api/0.log" Dec 03 15:15:22 crc kubenswrapper[5004]: I1203 15:15:22.014942 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_70b42ea8-681a-44cb-a494-2093b925d015/kube-state-metrics/0.log" Dec 03 15:15:22 crc kubenswrapper[5004]: I1203 15:15:22.098182 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-22sj5_3e3f3f7f-8810-4c7f-b3b0-975700874959/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:22 crc kubenswrapper[5004]: I1203 15:15:22.384198 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-66456bfc4f-v6lrf_5c21b585-fe01-4f87-9a60-1df17f266659/neutron-httpd/0.log" Dec 03 15:15:22 crc kubenswrapper[5004]: I1203 15:15:22.445727 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-66456bfc4f-v6lrf_5c21b585-fe01-4f87-9a60-1df17f266659/neutron-api/0.log" Dec 03 15:15:22 crc kubenswrapper[5004]: I1203 15:15:22.457386 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-chv9c_752f8ea2-1e21-4ff4-aac9-4f1a5f662561/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.042140 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_489bba73-88c3-42e0-ad06-ee95a6073263/nova-api-log/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.116148 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d73689af-1cf5-4846-8e52-c34bce039ca7/nova-cell0-conductor-conductor/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.311185 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_234c4127-5836-4628-a426-2644c4df71a1/nova-cell1-conductor-conductor/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.470396 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_489bba73-88c3-42e0-ad06-ee95a6073263/nova-api-api/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.509253 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_182237d5-f265-4577-8b9a-51f4e2a64a6a/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.598312 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ffvrz_32a75e28-35af-4a42-ae5c-ac1a24ba78ee/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:23 crc kubenswrapper[5004]: I1203 15:15:23.817758 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a010cac3-4f37-4ffd-8627-5329e566d91a/nova-metadata-log/0.log" Dec 03 15:15:24 crc kubenswrapper[5004]: I1203 15:15:24.104089 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/mysql-bootstrap/0.log" Dec 03 15:15:24 crc kubenswrapper[5004]: I1203 15:15:24.226558 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c4e5958b-4572-4965-8948-89fc51a2c486/nova-scheduler-scheduler/0.log" Dec 03 15:15:24 crc kubenswrapper[5004]: I1203 15:15:24.705512 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/galera/0.log" Dec 03 15:15:24 crc kubenswrapper[5004]: I1203 15:15:24.749394 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_83c3d56e-3bcd-407c-97e1-113485660567/mysql-bootstrap/0.log" Dec 03 15:15:24 crc kubenswrapper[5004]: I1203 15:15:24.872492 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/mysql-bootstrap/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.122685 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/mysql-bootstrap/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.138602 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_affd9c16-d0c4-4c54-b438-bdb3a4cafdd8/galera/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.159669 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a010cac3-4f37-4ffd-8627-5329e566d91a/nova-metadata-metadata/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.303806 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_32ad9006-53eb-4a4f-8ec0-8c287231374e/openstackclient/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.397157 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-x4sdg_01b52e65-ccad-48ac-91d0-b5b9fb3905cd/openstack-network-exporter/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.505401 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server-init/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.753665 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server-init/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.763603 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovs-vswitchd/0.log" Dec 03 15:15:25 crc kubenswrapper[5004]: I1203 15:15:25.804522 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-65kf4_038a0c7b-ce5f-481a-b716-e6b5f3077655/ovsdb-server/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.355337 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zdz2r_9cf66a90-3f7d-4170-8dab-9ff58ba576a3/ovn-controller/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.465290 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zkt82_122d652b-2c6a-4aa2-9303-e844922d4620/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.557936 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9af36c08-ab5c-4a97-88d3-a7ef2f032faf/openstack-network-exporter/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.686296 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9af36c08-ab5c-4a97-88d3-a7ef2f032faf/ovn-northd/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.703930 5004 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_116764ff-36a4-444f-8051-e93b94a548fd/openstack-network-exporter/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.826732 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_116764ff-36a4-444f-8051-e93b94a548fd/ovsdbserver-nb/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.887231 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4d5df9d0-5ee6-4981-86b4-e90415206ceb/openstack-network-exporter/0.log" Dec 03 15:15:26 crc kubenswrapper[5004]: I1203 15:15:26.961816 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4d5df9d0-5ee6-4981-86b4-e90415206ceb/ovsdbserver-sb/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.241239 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ccd4cc976-4jrqc_a6f2bf21-eade-495e-99bb-4d12b3c46c3b/placement-api/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.261647 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ccd4cc976-4jrqc_a6f2bf21-eade-495e-99bb-4d12b3c46c3b/placement-log/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.337193 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/setup-container/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.554060 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/setup-container/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.635011 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8febd608-4e34-4b42-bcf7-27dbf88b7a09/rabbitmq/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.706560 5004 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/setup-container/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.848777 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/setup-container/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.866991 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10a8bdc-f17c-4090-8c82-dcce9b638577/rabbitmq/0.log" Dec 03 15:15:27 crc kubenswrapper[5004]: I1203 15:15:27.932631 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6pm68_0c169631-8cd9-45a0-b295-026cd99d6e41/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.326680 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4skhq_5c867273-ae64-48f8-85f1-4eb5624b9dea/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.356169 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-62mbh_fada131d-446d-4819-b137-48910402240f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.543265 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6j48p_35e88acc-36ab-41a3-ab34-a04a3a4234de/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.613064 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.618076 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4qr7k_48017895-b32f-4afa-a7bf-e7e41c29d256/ssh-known-hosts-edpm-deployment/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.962229 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b9d87dc5f-trzlj_37c9311f-7b12-474a-ba76-c7c534f55e55/proxy-server/0.log" Dec 03 15:15:28 crc kubenswrapper[5004]: I1203 15:15:28.966534 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b9d87dc5f-trzlj_37c9311f-7b12-474a-ba76-c7c534f55e55/proxy-httpd/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.040321 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7xgl7_f9137320-4b52-422f-a96b-34c555c55aa6/swift-ring-rebalance/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.207071 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-reaper/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.207172 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-auditor/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.302066 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-replicator/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.464318 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-auditor/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.465689 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/account-server/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.487076 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-server/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.487180 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-replicator/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.644218 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/container-updater/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.696977 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-expirer/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.744116 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-auditor/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.808369 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-replicator/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.849917 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66"} Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.920623 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-server/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 15:15:29.972325 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/object-updater/0.log" Dec 03 15:15:29 crc kubenswrapper[5004]: I1203 
15:15:29.989908 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/rsync/0.log" Dec 03 15:15:30 crc kubenswrapper[5004]: I1203 15:15:30.044952 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b45d92a5-2abb-421d-826f-185ac63f4661/swift-recon-cron/0.log" Dec 03 15:15:30 crc kubenswrapper[5004]: I1203 15:15:30.235932 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xx47r_cf32991a-bf4f-4ce6-9d01-3b75e2108b9f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:30 crc kubenswrapper[5004]: I1203 15:15:30.303530 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_be771b30-f62b-4d18-977a-2c0d6ecca56a/tempest-tests-tempest-tests-runner/0.log" Dec 03 15:15:30 crc kubenswrapper[5004]: I1203 15:15:30.491527 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8fbccb0-338d-4e17-915a-25d07c3491c9/test-operator-logs-container/0.log" Dec 03 15:15:30 crc kubenswrapper[5004]: I1203 15:15:30.534139 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kftm2_8f5d5c71-22c1-4bd4-a95d-8865928a48c3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 15:15:38 crc kubenswrapper[5004]: I1203 15:15:38.739361 5004 scope.go:117] "RemoveContainer" containerID="bafd4d869393dc00f10a3311f85e90fca74e2a38ff304aeb4d32122162db11ba" Dec 03 15:15:39 crc kubenswrapper[5004]: I1203 15:15:39.064241 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3bfe51e0-df6b-446f-9647-d9165f3cdead/memcached/0.log" Dec 03 15:15:57 crc kubenswrapper[5004]: I1203 15:15:57.908419 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.059674 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.084488 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.118563 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.282432 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/pull/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.294573 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/extract/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.302260 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7d66292acdd2c2b4ab2b753c4ba64df4bb01b751001686fc66a13c8066x9sq8_11c96b2b-489e-47dd-9a49-30ee58d31916/util/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.471980 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-l4b9j_34247b31-24ab-4386-8bf1-f0bfa7df6f00/kube-rbac-proxy/0.log" Dec 03 15:15:58 crc 
kubenswrapper[5004]: I1203 15:15:58.491502 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qffz8_04d75592-adf5-42b6-a02e-0074674b393d/kube-rbac-proxy/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.541807 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-l4b9j_34247b31-24ab-4386-8bf1-f0bfa7df6f00/manager/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.700676 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qffz8_04d75592-adf5-42b6-a02e-0074674b393d/manager/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.729400 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-97hkh_e44a1a8b-fd83-478f-9095-73e2f82ed81c/manager/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.731268 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-97hkh_e44a1a8b-fd83-478f-9095-73e2f82ed81c/kube-rbac-proxy/0.log" Dec 03 15:15:58 crc kubenswrapper[5004]: I1203 15:15:58.905884 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4lhqd_271911f5-3a7c-448b-976d-268c5b19edc1/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.000184 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4lhqd_271911f5-3a7c-448b-976d-268c5b19edc1/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.075770 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-thtjj_9be3a985-7677-4334-b270-386feb954a5c/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.088227 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-thtjj_9be3a985-7677-4334-b270-386feb954a5c/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.205038 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lprd2_f10a5021-1caf-47ba-8dce-51021a641f4c/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.259606 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-lprd2_f10a5021-1caf-47ba-8dce-51021a641f4c/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.423585 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pv9cw_3741a6af-989d-47ac-a6ee-a6443a4f2883/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.505442 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-6fqgd_419e5e47-1866-473a-a668-2fee54cb76ce/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.594005 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pv9cw_3741a6af-989d-47ac-a6ee-a6443a4f2883/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.610767 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-6fqgd_419e5e47-1866-473a-a668-2fee54cb76ce/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.750994 5004 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-nwtth_3c18cd5e-8d20-4a2b-a62c-d141de1fc38a/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.859723 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-nwtth_3c18cd5e-8d20-4a2b-a62c-d141de1fc38a/manager/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.893222 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4ck5g_b70998ef-a4ea-49a9-922d-d7ad70346932/kube-rbac-proxy/0.log" Dec 03 15:15:59 crc kubenswrapper[5004]: I1203 15:15:59.915010 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4ck5g_b70998ef-a4ea-49a9-922d-d7ad70346932/manager/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.066042 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-kffh7_a38ab130-8698-49c3-bf30-355f88bcdc45/kube-rbac-proxy/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.123060 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-kffh7_a38ab130-8698-49c3-bf30-355f88bcdc45/manager/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.224978 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4b4kl_f35c5faa-53cc-4829-91a0-1c422eae75f6/kube-rbac-proxy/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.293282 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-4b4kl_f35c5faa-53cc-4829-91a0-1c422eae75f6/manager/0.log" Dec 03 15:16:00 crc 
kubenswrapper[5004]: I1203 15:16:00.357271 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-st4w4_dd7dec16-458d-46f6-9ee6-b0db6551792a/kube-rbac-proxy/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.457124 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-st4w4_dd7dec16-458d-46f6-9ee6-b0db6551792a/manager/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.528556 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gndmp_a1d5cb2a-85a6-4ff0-a9cf-519397479d2c/kube-rbac-proxy/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.553593 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gndmp_a1d5cb2a-85a6-4ff0-a9cf-519397479d2c/manager/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.681147 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd46zspw_7dbdc2c5-5e0c-4315-b836-1acacf93df2d/kube-rbac-proxy/0.log" Dec 03 15:16:00 crc kubenswrapper[5004]: I1203 15:16:00.724139 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd46zspw_7dbdc2c5-5e0c-4315-b836-1acacf93df2d/manager/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.075367 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d6f666fbc-smswg_1d0cab62-1f81-48c0-a3b3-3a774fcd7b18/operator/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.206046 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-x4d5l_dc67c749-644a-416d-8f75-ebd340795204/registry-server/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.305131 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-td552_2aab9a50-58d3-4eba-8589-c009d3b2b604/kube-rbac-proxy/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.469235 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-td552_2aab9a50-58d3-4eba-8589-c009d3b2b604/manager/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.618528 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-5z95c_90f6b1a6-2cd1-4649-b794-e00f64cd80cb/kube-rbac-proxy/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.784471 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-5z95c_90f6b1a6-2cd1-4649-b794-e00f64cd80cb/manager/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.836115 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gzd52_65687c7c-1b6d-485f-b99c-41706846c7a7/operator/0.log" Dec 03 15:16:01 crc kubenswrapper[5004]: I1203 15:16:01.926423 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8xwrm_9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b/kube-rbac-proxy/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.021537 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-79c58f7d4-4qmpw_a3fd1093-3e64-4558-9314-355dbf1c8a8c/manager/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.033076 5004 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-8xwrm_9bec5a93-cc9c-4f46-8ecc-dcdde9f9023b/manager/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.057374 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8bnn2_206c7f05-3575-400e-a37b-ba608f159fc5/kube-rbac-proxy/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.176213 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8bnn2_206c7f05-3575-400e-a37b-ba608f159fc5/manager/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.179364 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-92pmx_bf9d689f-bfab-4b05-9b08-d855836a7846/kube-rbac-proxy/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.229388 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-92pmx_bf9d689f-bfab-4b05-9b08-d855836a7846/manager/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.363530 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rlxc5_298e9f66-a005-42bd-b2f6-4653a88e0177/manager/0.log" Dec 03 15:16:02 crc kubenswrapper[5004]: I1203 15:16:02.367675 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-rlxc5_298e9f66-a005-42bd-b2f6-4653a88e0177/kube-rbac-proxy/0.log" Dec 03 15:16:22 crc kubenswrapper[5004]: I1203 15:16:22.404095 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mdbfw_a887d450-ffa8-4b30-98db-2e223c46b134/kube-rbac-proxy/0.log" Dec 03 15:16:22 crc kubenswrapper[5004]: I1203 
15:16:22.509455 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-swmpz_f986649e-61c8-4c67-beb3-edc5dc4e4fd9/control-plane-machine-set-operator/0.log" Dec 03 15:16:22 crc kubenswrapper[5004]: I1203 15:16:22.676137 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mdbfw_a887d450-ffa8-4b30-98db-2e223c46b134/machine-api-operator/0.log" Dec 03 15:16:35 crc kubenswrapper[5004]: I1203 15:16:35.907127 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4rv6q_0399bdc2-ceca-49e1-a00b-a8685a860ebe/cert-manager-controller/0.log" Dec 03 15:16:36 crc kubenswrapper[5004]: I1203 15:16:36.031579 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hjmk5_6b108801-1198-420f-ab57-dea765daf047/cert-manager-cainjector/0.log" Dec 03 15:16:36 crc kubenswrapper[5004]: I1203 15:16:36.080092 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-swvh8_a64714f4-8d4b-4101-bf1c-d953cddb3f08/cert-manager-webhook/0.log" Dec 03 15:16:49 crc kubenswrapper[5004]: I1203 15:16:49.754471 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-pgnkx_3c3279d4-50fa-454f-993b-ce1d1aa33140/nmstate-console-plugin/0.log" Dec 03 15:16:50 crc kubenswrapper[5004]: I1203 15:16:50.002371 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mvxhn_4efdf8bb-b98f-4afa-a605-0bb57c93b999/kube-rbac-proxy/0.log" Dec 03 15:16:50 crc kubenswrapper[5004]: I1203 15:16:50.025426 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hdbk9_ba4eef2f-7208-44fd-b116-6f394cf2c7e2/nmstate-handler/0.log" Dec 03 15:16:50 crc kubenswrapper[5004]: I1203 15:16:50.139281 5004 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mvxhn_4efdf8bb-b98f-4afa-a605-0bb57c93b999/nmstate-metrics/0.log" Dec 03 15:16:50 crc kubenswrapper[5004]: I1203 15:16:50.190897 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-pf52k_5a1faf91-6fda-4e62-801d-bb1624d95274/nmstate-operator/0.log" Dec 03 15:16:50 crc kubenswrapper[5004]: I1203 15:16:50.342790 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-xbgh5_c005e57a-6449-4c48-a81c-deda46fc3d02/nmstate-webhook/0.log" Dec 03 15:17:05 crc kubenswrapper[5004]: I1203 15:17:05.804362 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vfwkq_0af74282-be81-45a2-966a-4dcb279d7c6a/kube-rbac-proxy/0.log" Dec 03 15:17:05 crc kubenswrapper[5004]: I1203 15:17:05.905647 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vfwkq_0af74282-be81-45a2-966a-4dcb279d7c6a/controller/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.015813 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.180453 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.188685 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.194024 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 
15:17:06.216844 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.380344 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.404011 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.408533 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.457618 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.639251 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-metrics/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.659615 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-reloader/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.668704 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/cp-frr-files/0.log" Dec 03 15:17:06 crc kubenswrapper[5004]: I1203 15:17:06.704332 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/controller/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.238272 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/kube-rbac-proxy/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.254446 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/frr-metrics/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.265511 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/kube-rbac-proxy-frr/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.463292 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-jksx8_b1856a9d-f833-48a2-941b-8c9fd3f06416/frr-k8s-webhook-server/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.500225 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/reloader/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.708423 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b7864bd46-w9w6m_6aa91ebc-2da0-4d5e-9847-d5f2758e72e5/manager/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.941694 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8c7f47999-twv92_8d864da6-31ee-490f-b4e8-568f95a96ff0/webhook-server/0.log" Dec 03 15:17:07 crc kubenswrapper[5004]: I1203 15:17:07.955551 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6kv8g_27227413-e203-4218-942d-35c1493b7015/kube-rbac-proxy/0.log" Dec 03 15:17:08 crc kubenswrapper[5004]: I1203 15:17:08.545765 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6kv8g_27227413-e203-4218-942d-35c1493b7015/speaker/0.log" Dec 03 15:17:08 crc kubenswrapper[5004]: I1203 15:17:08.729645 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9r8qk_eba563db-6e27-423d-8739-ea22c19318ac/frr/0.log" Dec 03 15:17:20 crc kubenswrapper[5004]: I1203 15:17:20.915475 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.113462 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.114394 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.130401 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.277429 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.284172 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.290095 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fzxfgm_2328dd14-b73c-45d2-9ea7-bfb5c246e262/extract/0.log" Dec 03 15:17:21 crc 
kubenswrapper[5004]: I1203 15:17:21.456621 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.623142 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.646516 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.647354 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.833681 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/util/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.851321 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/extract/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.864058 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83n8jtr_e3c17e2f-5008-49fa-9e86-3d63c506af53/pull/0.log" Dec 03 15:17:21 crc kubenswrapper[5004]: I1203 15:17:21.989976 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.148141 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.158981 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.190450 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.360358 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-utilities/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.366211 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/extract-content/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.544835 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.744921 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.810552 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:17:22 crc kubenswrapper[5004]: I1203 15:17:22.850690 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.007085 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-utilities/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.060559 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkvg7_80fbd159-953d-4ede-956d-d40239fae0f0/registry-server/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.074497 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/extract-content/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.281099 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x2qkd_cca7b643-a679-4b89-b42d-a18c552a737b/marketplace-operator/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.453532 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.678263 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.731153 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.758775 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2mkgc_00273e2c-88dd-479a-a5e1-7791a7d0cb30/registry-server/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.899718 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:17:23 crc kubenswrapper[5004]: I1203 15:17:23.957074 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-content/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.007516 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/extract-utilities/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.172142 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-g99tq_8a6b3e02-1dfc-4967-809d-9bc9a2176fd4/registry-server/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.204359 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.391939 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.396023 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" 
Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.453644 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.615117 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-utilities/0.log" Dec 03 15:17:24 crc kubenswrapper[5004]: I1203 15:17:24.618081 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/extract-content/0.log" Dec 03 15:17:25 crc kubenswrapper[5004]: I1203 15:17:25.079561 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9qzr_5fd4fd02-ca91-407b-8558-9a0250a7851c/registry-server/0.log" Dec 03 15:17:52 crc kubenswrapper[5004]: I1203 15:17:52.824515 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:17:52 crc kubenswrapper[5004]: I1203 15:17:52.825144 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:18:22 crc kubenswrapper[5004]: I1203 15:18:22.824879 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 15:18:22 crc kubenswrapper[5004]: I1203 15:18:22.825618 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:18:52 crc kubenswrapper[5004]: I1203 15:18:52.824896 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:18:52 crc kubenswrapper[5004]: I1203 15:18:52.825435 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:18:52 crc kubenswrapper[5004]: I1203 15:18:52.825478 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 15:18:52 crc kubenswrapper[5004]: I1203 15:18:52.826170 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:18:52 crc kubenswrapper[5004]: I1203 15:18:52.826217 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66" gracePeriod=600 Dec 03 15:18:53 crc kubenswrapper[5004]: I1203 15:18:53.847313 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66" exitCode=0 Dec 03 15:18:53 crc kubenswrapper[5004]: I1203 15:18:53.847517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66"} Dec 03 15:18:53 crc kubenswrapper[5004]: I1203 15:18:53.847908 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerStarted","Data":"98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"} Dec 03 15:18:53 crc kubenswrapper[5004]: I1203 15:18:53.847928 5004 scope.go:117] "RemoveContainer" containerID="aeb8cbf5b312752b103df2140b73246cfa351a43157027bb4ee019ce5e4c6a4c" Dec 03 15:19:07 crc kubenswrapper[5004]: I1203 15:19:07.990014 5004 generic.go:334] "Generic (PLEG): container finished" podID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerID="8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069" exitCode=0 Dec 03 15:19:07 crc kubenswrapper[5004]: I1203 15:19:07.990109 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" event={"ID":"e024fbff-5e92-4fc3-b5d8-31a69957a91f","Type":"ContainerDied","Data":"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069"} Dec 03 15:19:07 crc kubenswrapper[5004]: I1203 
15:19:07.991571 5004 scope.go:117] "RemoveContainer" containerID="8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069" Dec 03 15:19:08 crc kubenswrapper[5004]: I1203 15:19:08.168086 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zlkfg_must-gather-xrhp6_e024fbff-5e92-4fc3-b5d8-31a69957a91f/gather/0.log" Dec 03 15:19:19 crc kubenswrapper[5004]: I1203 15:19:19.137546 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zlkfg/must-gather-xrhp6"] Dec 03 15:19:19 crc kubenswrapper[5004]: I1203 15:19:19.138553 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="copy" containerID="cri-o://50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a" gracePeriod=2 Dec 03 15:19:19 crc kubenswrapper[5004]: I1203 15:19:19.149343 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zlkfg/must-gather-xrhp6"] Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.580918 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zlkfg_must-gather-xrhp6_e024fbff-5e92-4fc3-b5d8-31a69957a91f/copy/0.log" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.582195 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.726528 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2qth\" (UniqueName: \"kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth\") pod \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.726646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output\") pod \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\" (UID: \"e024fbff-5e92-4fc3-b5d8-31a69957a91f\") " Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.756807 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth" (OuterVolumeSpecName: "kube-api-access-z2qth") pod "e024fbff-5e92-4fc3-b5d8-31a69957a91f" (UID: "e024fbff-5e92-4fc3-b5d8-31a69957a91f"). InnerVolumeSpecName "kube-api-access-z2qth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.828990 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2qth\" (UniqueName: \"kubernetes.io/projected/e024fbff-5e92-4fc3-b5d8-31a69957a91f-kube-api-access-z2qth\") on node \"crc\" DevicePath \"\"" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:19.949524 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e024fbff-5e92-4fc3-b5d8-31a69957a91f" (UID: "e024fbff-5e92-4fc3-b5d8-31a69957a91f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.032620 5004 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e024fbff-5e92-4fc3-b5d8-31a69957a91f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.099549 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zlkfg_must-gather-xrhp6_e024fbff-5e92-4fc3-b5d8-31a69957a91f/copy/0.log" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.099927 5004 generic.go:334] "Generic (PLEG): container finished" podID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerID="50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a" exitCode=143 Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.099980 5004 scope.go:117] "RemoveContainer" containerID="50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.100120 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zlkfg/must-gather-xrhp6" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.126309 5004 scope.go:117] "RemoveContainer" containerID="8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.191436 5004 scope.go:117] "RemoveContainer" containerID="50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a" Dec 03 15:19:20 crc kubenswrapper[5004]: E1203 15:19:20.191987 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a\": container with ID starting with 50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a not found: ID does not exist" containerID="50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.192021 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a"} err="failed to get container status \"50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a\": rpc error: code = NotFound desc = could not find container \"50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a\": container with ID starting with 50ec7277e5044fdf7b6776c0e4366c5f084a38aa8e8d87cc2674eaa0819d296a not found: ID does not exist" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.192077 5004 scope.go:117] "RemoveContainer" containerID="8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069" Dec 03 15:19:20 crc kubenswrapper[5004]: E1203 15:19:20.192578 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069\": container with ID starting with 
8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069 not found: ID does not exist" containerID="8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069" Dec 03 15:19:20 crc kubenswrapper[5004]: I1203 15:19:20.192637 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069"} err="failed to get container status \"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069\": rpc error: code = NotFound desc = could not find container \"8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069\": container with ID starting with 8cf58307a61e2ca14771527d7adb949e8376a1b3d7696b81e9da9e5bf90af069 not found: ID does not exist" Dec 03 15:19:21 crc kubenswrapper[5004]: I1203 15:19:21.627124 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" path="/var/lib/kubelet/pods/e024fbff-5e92-4fc3-b5d8-31a69957a91f/volumes" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.494650 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:19:54 crc kubenswrapper[5004]: E1203 15:19:54.495916 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddb107b-d7e9-405f-b79c-d7abadf7735b" containerName="collect-profiles" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.495940 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fddb107b-d7e9-405f-b79c-d7abadf7735b" containerName="collect-profiles" Dec 03 15:19:54 crc kubenswrapper[5004]: E1203 15:19:54.495992 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="gather" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.496005 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="gather" Dec 03 15:19:54 crc 
kubenswrapper[5004]: E1203 15:19:54.496027 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="copy" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.496039 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="copy" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.496395 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fddb107b-d7e9-405f-b79c-d7abadf7735b" containerName="collect-profiles" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.496420 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="copy" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.496435 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e024fbff-5e92-4fc3-b5d8-31a69957a91f" containerName="gather" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.499058 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.525453 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.593846 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9f4\" (UniqueName: \"kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.593978 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.594022 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.696239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9f4\" (UniqueName: \"kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.696324 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.696353 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.696882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.696908 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.718071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9f4\" (UniqueName: \"kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4\") pod \"redhat-marketplace-q86rp\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:54 crc kubenswrapper[5004]: I1203 15:19:54.849272 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:19:55 crc kubenswrapper[5004]: I1203 15:19:55.319887 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:19:55 crc kubenswrapper[5004]: I1203 15:19:55.463388 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerStarted","Data":"5015a2c566f960c88aa0d2eaa8751426a6e8f0b6b432fb02316efaa421264a49"} Dec 03 15:19:56 crc kubenswrapper[5004]: I1203 15:19:56.472727 5004 generic.go:334] "Generic (PLEG): container finished" podID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerID="515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79" exitCode=0 Dec 03 15:19:56 crc kubenswrapper[5004]: I1203 15:19:56.472833 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerDied","Data":"515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79"} Dec 03 15:19:56 crc kubenswrapper[5004]: I1203 15:19:56.474678 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 15:19:57 crc kubenswrapper[5004]: I1203 15:19:57.493378 5004 generic.go:334] "Generic (PLEG): container finished" podID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerID="f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1" exitCode=0 Dec 03 15:19:57 crc kubenswrapper[5004]: I1203 15:19:57.493673 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerDied","Data":"f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1"} Dec 03 15:19:59 crc kubenswrapper[5004]: I1203 15:19:59.512173 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerStarted","Data":"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1"} Dec 03 15:19:59 crc kubenswrapper[5004]: I1203 15:19:59.534919 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q86rp" podStartSLOduration=4.098233367 podStartE2EDuration="5.534897408s" podCreationTimestamp="2025-12-03 15:19:54 +0000 UTC" firstStartedPulling="2025-12-03 15:19:56.474461907 +0000 UTC m=+4409.223432143" lastFinishedPulling="2025-12-03 15:19:57.911125948 +0000 UTC m=+4410.660096184" observedRunningTime="2025-12-03 15:19:59.526484601 +0000 UTC m=+4412.275454837" watchObservedRunningTime="2025-12-03 15:19:59.534897408 +0000 UTC m=+4412.283867644" Dec 03 15:20:04 crc kubenswrapper[5004]: I1203 15:20:04.849813 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:04 crc kubenswrapper[5004]: I1203 15:20:04.850448 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:04 crc kubenswrapper[5004]: I1203 15:20:04.898137 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:05 crc kubenswrapper[5004]: I1203 15:20:05.625762 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:05 crc kubenswrapper[5004]: I1203 15:20:05.683100 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:20:07 crc kubenswrapper[5004]: I1203 15:20:07.585796 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q86rp" 
podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="registry-server" containerID="cri-o://0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1" gracePeriod=2 Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.057528 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.161374 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9f4\" (UniqueName: \"kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4\") pod \"a416b51b-93de-4168-9c96-95597b1f9dfe\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.161625 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities\") pod \"a416b51b-93de-4168-9c96-95597b1f9dfe\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.161674 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content\") pod \"a416b51b-93de-4168-9c96-95597b1f9dfe\" (UID: \"a416b51b-93de-4168-9c96-95597b1f9dfe\") " Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.162508 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities" (OuterVolumeSpecName: "utilities") pod "a416b51b-93de-4168-9c96-95597b1f9dfe" (UID: "a416b51b-93de-4168-9c96-95597b1f9dfe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.166695 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4" (OuterVolumeSpecName: "kube-api-access-ls9f4") pod "a416b51b-93de-4168-9c96-95597b1f9dfe" (UID: "a416b51b-93de-4168-9c96-95597b1f9dfe"). InnerVolumeSpecName "kube-api-access-ls9f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.184674 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a416b51b-93de-4168-9c96-95597b1f9dfe" (UID: "a416b51b-93de-4168-9c96-95597b1f9dfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.263727 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.263764 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a416b51b-93de-4168-9c96-95597b1f9dfe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.263776 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9f4\" (UniqueName: \"kubernetes.io/projected/a416b51b-93de-4168-9c96-95597b1f9dfe-kube-api-access-ls9f4\") on node \"crc\" DevicePath \"\"" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.596018 5004 generic.go:334] "Generic (PLEG): container finished" podID="a416b51b-93de-4168-9c96-95597b1f9dfe" 
containerID="0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1" exitCode=0 Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.596067 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerDied","Data":"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1"} Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.596118 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q86rp" event={"ID":"a416b51b-93de-4168-9c96-95597b1f9dfe","Type":"ContainerDied","Data":"5015a2c566f960c88aa0d2eaa8751426a6e8f0b6b432fb02316efaa421264a49"} Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.596121 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q86rp" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.596136 5004 scope.go:117] "RemoveContainer" containerID="0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.626993 5004 scope.go:117] "RemoveContainer" containerID="f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.645573 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.659558 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q86rp"] Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.672983 5004 scope.go:117] "RemoveContainer" containerID="515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.702822 5004 scope.go:117] "RemoveContainer" containerID="0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1" Dec 03 
15:20:08 crc kubenswrapper[5004]: E1203 15:20:08.703296 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1\": container with ID starting with 0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1 not found: ID does not exist" containerID="0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.703414 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1"} err="failed to get container status \"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1\": rpc error: code = NotFound desc = could not find container \"0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1\": container with ID starting with 0e7fec31f7d1f26f4e5eebd7940874fcfe097cafe3f37080fe9916a410d149e1 not found: ID does not exist" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.703514 5004 scope.go:117] "RemoveContainer" containerID="f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1" Dec 03 15:20:08 crc kubenswrapper[5004]: E1203 15:20:08.703944 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1\": container with ID starting with f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1 not found: ID does not exist" containerID="f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.704033 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1"} err="failed to get container status 
\"f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1\": rpc error: code = NotFound desc = could not find container \"f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1\": container with ID starting with f234464e7fbda21349c73d0dd631ca9708bd4ab77f35487216dc5a3f8ed526e1 not found: ID does not exist" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.704106 5004 scope.go:117] "RemoveContainer" containerID="515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79" Dec 03 15:20:08 crc kubenswrapper[5004]: E1203 15:20:08.704453 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79\": container with ID starting with 515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79 not found: ID does not exist" containerID="515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79" Dec 03 15:20:08 crc kubenswrapper[5004]: I1203 15:20:08.704555 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79"} err="failed to get container status \"515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79\": rpc error: code = NotFound desc = could not find container \"515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79\": container with ID starting with 515ea74e77cce166f53d5d82d53a18e76d859c88235d7dbd494b847a89ecbb79 not found: ID does not exist" Dec 03 15:20:09 crc kubenswrapper[5004]: I1203 15:20:09.625348 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" path="/var/lib/kubelet/pods/a416b51b-93de-4168-9c96-95597b1f9dfe/volumes" Dec 03 15:20:38 crc kubenswrapper[5004]: I1203 15:20:38.950149 5004 scope.go:117] "RemoveContainer" containerID="3339af3466aa2d99e8c73e8a2122b4e74fd89f8465126a038e565357e1ea24cc" Dec 03 
15:21:22 crc kubenswrapper[5004]: I1203 15:21:22.824813 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:21:22 crc kubenswrapper[5004]: I1203 15:21:22.825430 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:21:39 crc kubenswrapper[5004]: I1203 15:21:39.051411 5004 scope.go:117] "RemoveContainer" containerID="48f87d744aab4f1492061930a8d4c81417e3a39d111a2b1a02998ff0e2c40831" Dec 03 15:21:52 crc kubenswrapper[5004]: I1203 15:21:52.824493 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 15:21:52 crc kubenswrapper[5004]: I1203 15:21:52.825122 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:22:22 crc kubenswrapper[5004]: I1203 15:22:22.824023 5004 patch_prober.go:28] interesting pod/machine-config-daemon-m4g6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 03 15:22:22 crc kubenswrapper[5004]: I1203 15:22:22.824623 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 15:22:22 crc kubenswrapper[5004]: I1203 15:22:22.824676 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" Dec 03 15:22:22 crc kubenswrapper[5004]: I1203 15:22:22.825455 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"} pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 15:22:22 crc kubenswrapper[5004]: I1203 15:22:22.825510 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerName="machine-config-daemon" containerID="cri-o://98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126" gracePeriod=600 Dec 03 15:22:23 crc kubenswrapper[5004]: I1203 15:22:23.922732 5004 generic.go:334] "Generic (PLEG): container finished" podID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126" exitCode=0 Dec 03 15:22:23 crc kubenswrapper[5004]: I1203 15:22:23.923117 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" 
event={"ID":"7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94","Type":"ContainerDied","Data":"98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"} Dec 03 15:22:23 crc kubenswrapper[5004]: I1203 15:22:23.924259 5004 scope.go:117] "RemoveContainer" containerID="c3a4ed29630b3f73883742cb32ae61c342751d5c2206c350c91e4f4840b13e66" Dec 03 15:22:24 crc kubenswrapper[5004]: E1203 15:22:24.266344 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:22:24 crc kubenswrapper[5004]: I1203 15:22:24.935471 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126" Dec 03 15:22:24 crc kubenswrapper[5004]: E1203 15:22:24.935899 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94" Dec 03 15:22:36 crc kubenswrapper[5004]: I1203 15:22:36.613602 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126" Dec 03 15:22:36 crc kubenswrapper[5004]: E1203 15:22:36.614429 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:22:50 crc kubenswrapper[5004]: I1203 15:22:50.613597 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"
Dec 03 15:22:50 crc kubenswrapper[5004]: E1203 15:22:50.614697 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.769337 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:22:51 crc kubenswrapper[5004]: E1203 15:22:51.770190 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="registry-server"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.770207 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="registry-server"
Dec 03 15:22:51 crc kubenswrapper[5004]: E1203 15:22:51.770220 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="extract-utilities"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.770228 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="extract-utilities"
Dec 03 15:22:51 crc kubenswrapper[5004]: E1203 15:22:51.770253 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="extract-content"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.770280 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="extract-content"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.770532 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a416b51b-93de-4168-9c96-95597b1f9dfe" containerName="registry-server"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.772209 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.789418 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.909902 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d92k\" (UniqueName: \"kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.910039 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:51 crc kubenswrapper[5004]: I1203 15:22:51.910098 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.011723 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.011811 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d92k\" (UniqueName: \"kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.011962 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.012678 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.013010 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.046062 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d92k\" (UniqueName: \"kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k\") pod \"community-operators-c4tsn\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") " pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.101666 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:22:52 crc kubenswrapper[5004]: I1203 15:22:52.690380 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:22:53 crc kubenswrapper[5004]: I1203 15:22:53.206380 5004 generic.go:334] "Generic (PLEG): container finished" podID="fb44196e-0be1-44c9-a650-32747320b650" containerID="8047a9bc65ce4669fbc6e2c9382b117e051ce29696eb62e3809c8225e05abec5" exitCode=0
Dec 03 15:22:53 crc kubenswrapper[5004]: I1203 15:22:53.206478 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerDied","Data":"8047a9bc65ce4669fbc6e2c9382b117e051ce29696eb62e3809c8225e05abec5"}
Dec 03 15:22:53 crc kubenswrapper[5004]: I1203 15:22:53.206752 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerStarted","Data":"6297081919f15927b8547255052074369db0f1b7787ac84f76ab5a6863d69913"}
Dec 03 15:22:54 crc kubenswrapper[5004]: I1203 15:22:54.233942 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerStarted","Data":"e48730e4b4bb14108b559d995820d5942cc8707e6535db7ff3714aa80b0999e3"}
Dec 03 15:22:55 crc kubenswrapper[5004]: I1203 15:22:55.245603 5004 generic.go:334] "Generic (PLEG): container finished" podID="fb44196e-0be1-44c9-a650-32747320b650" containerID="e48730e4b4bb14108b559d995820d5942cc8707e6535db7ff3714aa80b0999e3" exitCode=0
Dec 03 15:22:55 crc kubenswrapper[5004]: I1203 15:22:55.245651 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerDied","Data":"e48730e4b4bb14108b559d995820d5942cc8707e6535db7ff3714aa80b0999e3"}
Dec 03 15:22:56 crc kubenswrapper[5004]: I1203 15:22:56.256011 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerStarted","Data":"c2665a55375c383486715db589f1202cfabf8bd7cc4800d3e2529de3324053dd"}
Dec 03 15:22:56 crc kubenswrapper[5004]: I1203 15:22:56.286160 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c4tsn" podStartSLOduration=2.827124036 podStartE2EDuration="5.286129741s" podCreationTimestamp="2025-12-03 15:22:51 +0000 UTC" firstStartedPulling="2025-12-03 15:22:53.208143786 +0000 UTC m=+4585.957114022" lastFinishedPulling="2025-12-03 15:22:55.667149471 +0000 UTC m=+4588.416119727" observedRunningTime="2025-12-03 15:22:56.275914184 +0000 UTC m=+4589.024884440" watchObservedRunningTime="2025-12-03 15:22:56.286129741 +0000 UTC m=+4589.035100017"
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.102189 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.103350 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.150762 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.371069 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.438753 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:23:02 crc kubenswrapper[5004]: I1203 15:23:02.613563 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"
Dec 03 15:23:02 crc kubenswrapper[5004]: E1203 15:23:02.613846 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:23:04 crc kubenswrapper[5004]: I1203 15:23:04.325129 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c4tsn" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="registry-server" containerID="cri-o://c2665a55375c383486715db589f1202cfabf8bd7cc4800d3e2529de3324053dd" gracePeriod=2
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.359384 5004 generic.go:334] "Generic (PLEG): container finished" podID="fb44196e-0be1-44c9-a650-32747320b650" containerID="c2665a55375c383486715db589f1202cfabf8bd7cc4800d3e2529de3324053dd" exitCode=0
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.359475 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerDied","Data":"c2665a55375c383486715db589f1202cfabf8bd7cc4800d3e2529de3324053dd"}
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.551203 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.687410 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d92k\" (UniqueName: \"kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k\") pod \"fb44196e-0be1-44c9-a650-32747320b650\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") "
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.687460 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities\") pod \"fb44196e-0be1-44c9-a650-32747320b650\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") "
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.687661 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content\") pod \"fb44196e-0be1-44c9-a650-32747320b650\" (UID: \"fb44196e-0be1-44c9-a650-32747320b650\") "
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.689322 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities" (OuterVolumeSpecName: "utilities") pod "fb44196e-0be1-44c9-a650-32747320b650" (UID: "fb44196e-0be1-44c9-a650-32747320b650"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.708609 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k" (OuterVolumeSpecName: "kube-api-access-4d92k") pod "fb44196e-0be1-44c9-a650-32747320b650" (UID: "fb44196e-0be1-44c9-a650-32747320b650"). InnerVolumeSpecName "kube-api-access-4d92k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.747495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb44196e-0be1-44c9-a650-32747320b650" (UID: "fb44196e-0be1-44c9-a650-32747320b650"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.790304 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.790349 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d92k\" (UniqueName: \"kubernetes.io/projected/fb44196e-0be1-44c9-a650-32747320b650-kube-api-access-4d92k\") on node \"crc\" DevicePath \"\""
Dec 03 15:23:05 crc kubenswrapper[5004]: I1203 15:23:05.790364 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb44196e-0be1-44c9-a650-32747320b650-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.380999 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4tsn" event={"ID":"fb44196e-0be1-44c9-a650-32747320b650","Type":"ContainerDied","Data":"6297081919f15927b8547255052074369db0f1b7787ac84f76ab5a6863d69913"}
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.381401 5004 scope.go:117] "RemoveContainer" containerID="c2665a55375c383486715db589f1202cfabf8bd7cc4800d3e2529de3324053dd"
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.381555 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4tsn"
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.417939 5004 scope.go:117] "RemoveContainer" containerID="e48730e4b4bb14108b559d995820d5942cc8707e6535db7ff3714aa80b0999e3"
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.426190 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.455241 5004 scope.go:117] "RemoveContainer" containerID="8047a9bc65ce4669fbc6e2c9382b117e051ce29696eb62e3809c8225e05abec5"
Dec 03 15:23:06 crc kubenswrapper[5004]: I1203 15:23:06.457133 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c4tsn"]
Dec 03 15:23:07 crc kubenswrapper[5004]: I1203 15:23:07.630180 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb44196e-0be1-44c9-a650-32747320b650" path="/var/lib/kubelet/pods/fb44196e-0be1-44c9-a650-32747320b650/volumes"
Dec 03 15:23:14 crc kubenswrapper[5004]: I1203 15:23:14.613590 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"
Dec 03 15:23:14 crc kubenswrapper[5004]: E1203 15:23:14.614542 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:23:27 crc kubenswrapper[5004]: I1203 15:23:27.625127 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"
Dec 03 15:23:27 crc kubenswrapper[5004]: E1203 15:23:27.625725 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:23:38 crc kubenswrapper[5004]: I1203 15:23:38.614042 5004 scope.go:117] "RemoveContainer" containerID="98fd6fc49347ddd9052982d7e09915ce6acef71bc9dd1f64581a312dbee3b126"
Dec 03 15:23:38 crc kubenswrapper[5004]: E1203 15:23:38.615021 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m4g6v_openshift-machine-config-operator(7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94)\"" pod="openshift-machine-config-operator/machine-config-daemon-m4g6v" podUID="7c6cf6ea-c7f7-44bb-b1fa-9e8d5f1d9c94"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.278777 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-949ll"]
Dec 03 15:23:50 crc kubenswrapper[5004]: E1203 15:23:50.280002 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="extract-utilities"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.280018 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="extract-utilities"
Dec 03 15:23:50 crc kubenswrapper[5004]: E1203 15:23:50.280044 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="registry-server"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.280052 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="registry-server"
Dec 03 15:23:50 crc kubenswrapper[5004]: E1203 15:23:50.280091 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="extract-content"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.280099 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="extract-content"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.280321 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb44196e-0be1-44c9-a650-32747320b650" containerName="registry-server"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.281659 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.294885 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-949ll"]
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.436965 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-utilities\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.437017 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5zg\" (UniqueName: \"kubernetes.io/projected/1a2cc9fe-d418-424b-9b2f-95ac611463dc-kube-api-access-sl5zg\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.437066 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-catalog-content\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.540130 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-utilities\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.540706 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5zg\" (UniqueName: \"kubernetes.io/projected/1a2cc9fe-d418-424b-9b2f-95ac611463dc-kube-api-access-sl5zg\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.540736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-utilities\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.543180 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-catalog-content\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.544132 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a2cc9fe-d418-424b-9b2f-95ac611463dc-catalog-content\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.568086 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5zg\" (UniqueName: \"kubernetes.io/projected/1a2cc9fe-d418-424b-9b2f-95ac611463dc-kube-api-access-sl5zg\") pod \"certified-operators-949ll\" (UID: \"1a2cc9fe-d418-424b-9b2f-95ac611463dc\") " pod="openshift-marketplace/certified-operators-949ll"
Dec 03 15:23:50 crc kubenswrapper[5004]: I1203 15:23:50.604256 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-949ll"